Mock Version: 2.12
Mock Version: 2.12
Mock Version: 2.12
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'], chrootPath='/var/lib/mock/f36-build-side-48712-31618997-4329373/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=201600uid=1000gid=425user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
warning: Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1631836800
Wrote: /builddir/build/SRPMS/pythran-0.11.0-0.fc36.src.rpm
Child return code was: 0
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'], chrootPath='/var/lib/mock/f36-build-side-48712-31618997-4329373/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=201600uid=1000gid=425user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
warning: Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1631836800
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.29bR5K
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf pythran-feature-0.11.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/0.11.0.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd pythran-feature-0.11.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ find -name '*.hpp' -exec chmod -x '{}' +
+ sed -i '1{/#!/d}' pythran/run.py
+ rm -r third_party/boost third_party/xsimd
+ cat
+ sed -i 's|blas=blas|blas=openblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg
+ sed -i 's|libs=|libs=flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg
+ sed -i 's|include_dirs=|include_dirs=/usr/include/flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg
+ sed -i /guzzle_sphinx_theme/d docs/conf.py docs/requirements.txt
+ sed -i -e s/-O0/-O1/g -e s/-Werror/-w/g pythran/tests/__init__.py
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.Imp2od
+ umask 022
+ cd /builddir/build/BUILD
+ cd pythran-feature-0.11.0
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ RPM_TOXENV=py310
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -s /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 -x doc
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8 (installed: setuptools 58.5.3)
Handling wheel from default build backend
Requirement not satisfied: wheel
Exiting dependency generation pass: build backend
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/pythran-0.11.0-0.fc36.buildreqs.nosrc.rpm
Child return code was: 11
Dynamic buildrequires detected
Going to install missing buildrequires. See root.log for details.
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'], chrootPath='/var/lib/mock/f36-build-side-48712-31618997-4329373/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=201600uid=1000gid=425user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
warning: Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz
Building target platforms: ppc64le
Building for target ppc64le
setting SOURCE_DATE_EPOCH=1631836800
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.SiQ2nI
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf pythran-feature-0.11.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/0.11.0.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd pythran-feature-0.11.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ find -name '*.hpp' -exec chmod -x '{}' + + sed -i '1{/#!/d}' pythran/run.py + rm -r third_party/boost third_party/xsimd + cat + sed -i 's|blas=blas|blas=openblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i 's|libs=|libs=flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i 's|include_dirs=|include_dirs=/usr/include/flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i /guzzle_sphinx_theme/d docs/conf.py docs/requirements.txt + sed -i -e s/-O0/-O1/g -e s/-Werror/-w/g pythran/tests/__init__.py + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.ltjlvk + umask 022 + cd /builddir/build/BUILD + cd pythran-feature-0.11.0 + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + '[' -f setup.py ']' + echo 'python3dist(setuptools) >= 40.8' + echo 'python3dist(wheel)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + RPM_TOXENV=py310 + HOSTNAME=rpmbuild + /usr/bin/python3 -s /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 -x doc Handling setuptools >= 40.8 from default build backend Requirement satisfied: setuptools >= 40.8 (installed: setuptools 58.5.3) Handling wheel from default build backend Requirement satisfied: wheel (installed: wheel 0.37.0) package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) warning: no files found matching '*' under directory 'third_party' HOOK STDOUT: running egg_info HOOK STDOUT: creating pythran.egg-info HOOK STDOUT: writing pythran.egg-info/PKG-INFO HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt HOOK STDOUT: writing requirements to pythran.egg-info/requires.txt HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest template 'MANIFEST.in' HOOK STDOUT: adding license file 'LICENSE' HOOK STDOUT: adding license file 'AUTHORS' HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement satisfied: wheel (installed: wheel 0.37.0) package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) warning: no files found matching '*' under directory 'third_party' HOOK STDOUT: running dist_info HOOK STDOUT: writing pythran.egg-info/PKG-INFO HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt HOOK STDOUT: writing requirements to pythran.egg-info/requires.txt HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest template 'MANIFEST.in' HOOK STDOUT: adding license file 'LICENSE' HOOK STDOUT: adding license file 'AUTHORS' HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: creating '/builddir/build/BUILD/pythran-feature-0.11.0/pythran.dist-info' HOOK STDOUT: adding license file "LICENSE" (matched pattern "LICEN[CS]E*") HOOK STDOUT: adding license file "AUTHORS" (matched pattern "AUTHORS*") Handling ply (>=3.4) from wheel metadata: Requires-Dist Requirement not satisfied: ply (>=3.4) Handling gast (~=0.5.0) from wheel metadata: Requires-Dist Requirement not 
satisfied: gast (~=0.5.0) Handling numpy from wheel metadata: Requires-Dist Requirement satisfied: numpy (installed: numpy 1.21.1) Handling beniget (~=0.4.0) from wheel metadata: Requires-Dist Requirement not satisfied: beniget (~=0.4.0) Handling numpy ; extra == 'doc' from wheel metadata: Requires-Dist Requirement satisfied: numpy ; extra == 'doc' (installed: numpy 1.21.1) Handling nbsphinx ; extra == 'doc' from wheel metadata: Requires-Dist Requirement not satisfied: nbsphinx ; extra == 'doc' Handling scipy ; extra == 'doc' from wheel metadata: Requires-Dist Requirement satisfied: scipy ; extra == 'doc' (installed: scipy 1.7.0) + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/pythran-0.11.0-0.fc36.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'], chrootPath='/var/lib/mock/f36-build-side-48712-31618997-4329373/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=201600uid=1000gid=425user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False warning: Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz Building target platforms: ppc64le Building for target ppc64le setting SOURCE_DATE_EPOCH=1631836800 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.PBrlho + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf pythran-feature-0.11.0 + /usr/bin/tar -xof - + /usr/bin/gzip -dc /builddir/build/SOURCES/0.11.0.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd pythran-feature-0.11.0 + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 
+ find -name '*.hpp' -exec chmod -x '{}' + + sed -i '1{/#!/d}' pythran/run.py + rm -r third_party/boost third_party/xsimd + cat + sed -i 's|blas=blas|blas=openblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i 's|libs=|libs=flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i 's|include_dirs=|include_dirs=/usr/include/flexiblas|' pythran/pythran-linux.cfg pythran/pythran-linux2.cfg + sed -i /guzzle_sphinx_theme/d docs/conf.py docs/requirements.txt + sed -i -e s/-O0/-O1/g -e s/-Werror/-w/g pythran/tests/__init__.py + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.waFAIV + umask 022 + cd /builddir/build/BUILD + cd pythran-feature-0.11.0 + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + '[' -f setup.py ']' + echo 'python3dist(setuptools) >= 40.8' + echo 'python3dist(wheel)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + RPM_TOXENV=py310 + HOSTNAME=rpmbuild + /usr/bin/python3 -s /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 -x doc Handling setuptools >= 40.8 from default build backend Requirement satisfied: setuptools >= 40.8 (installed: setuptools 58.5.3) Handling wheel from default build backend Requirement satisfied: wheel (installed: wheel 0.37.0) package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) warning: no files found matching '*' under directory 'third_party' HOOK STDOUT: running egg_info HOOK STDOUT: creating pythran.egg-info HOOK STDOUT: writing pythran.egg-info/PKG-INFO HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt HOOK STDOUT: writing requirements to pythran.egg-info/requires.txt HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest template 'MANIFEST.in' HOOK STDOUT: adding license file 'LICENSE' HOOK STDOUT: adding license file 'AUTHORS' HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement satisfied: wheel (installed: wheel 0.37.0) package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) warning: no files found matching '*' under directory 'third_party' HOOK STDOUT: running dist_info HOOK STDOUT: writing pythran.egg-info/PKG-INFO HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt HOOK STDOUT: writing requirements to pythran.egg-info/requires.txt HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: reading manifest template 'MANIFEST.in' HOOK STDOUT: adding license file 'LICENSE' HOOK STDOUT: adding license file 'AUTHORS' HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt' HOOK STDOUT: creating '/builddir/build/BUILD/pythran-feature-0.11.0/pythran.dist-info' HOOK STDOUT: adding license file "LICENSE" (matched pattern "LICEN[CS]E*") HOOK STDOUT: adding license file "AUTHORS" (matched pattern "AUTHORS*") Handling ply (>=3.4) from wheel metadata: Requires-Dist Requirement satisfied: ply (>=3.4) (installed: ply 3.11) Handling gast (~=0.5.0) from wheel metadata: 
Requires-Dist Requirement satisfied: gast (~=0.5.0) (installed: gast 0.5.3) Handling numpy from wheel metadata: Requires-Dist Requirement satisfied: numpy (installed: numpy 1.21.1) Handling beniget (~=0.4.0) from wheel metadata: Requires-Dist Requirement satisfied: beniget (~=0.4.0) (installed: beniget 0.4.1) Handling numpy ; extra == 'doc' from wheel metadata: Requires-Dist Requirement satisfied: numpy ; extra == 'doc' (installed: numpy 1.21.1) Handling nbsphinx ; extra == 'doc' from wheel metadata: Requires-Dist Requirement satisfied: nbsphinx ; extra == 'doc' (installed: nbsphinx 0.8.7) Handling scipy ; extra == 'doc' from wheel metadata: Requires-Dist Requirement satisfied: scipy ; extra == 'doc' (installed: scipy 1.7.0) + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/pythran-0.11.0-0.fc36.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'], chrootPath='/var/lib/mock/f36-build-side-48712-31618997-4329373/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=201600uid=1000gid=425user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False warning: Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz Building target platforms: ppc64le Building for target ppc64le setting SOURCE_DATE_EPOCH=1631836800 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.EJR6Se + umask 022 + cd /builddir/build/BUILD + cd pythran-feature-0.11.0 + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + '[' -f setup.py ']' + echo 'python3dist(setuptools) >= 40.8' + echo 'python3dist(wheel)' + rm -rfv pythran.dist-info/ removed 'pythran.dist-info/entry_points.txt' removed 'pythran.dist-info/top_level.txt' removed 'pythran.dist-info/METADATA' removed 'pythran.dist-info/LICENSE' removed 'pythran.dist-info/AUTHORS' removed directory 'pythran.dist-info/' + '[' -f /usr/bin/python3 ']' + RPM_TOXENV=py310 + HOSTNAME=rpmbuild + /usr/bin/python3 -s /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 -x doc Handling setuptools >= 40.8 from default build backend Requirement satisfied: setuptools >= 40.8 (installed: setuptools 58.5.3) Handling wheel from default build backend Requirement satisfied: wheel (installed: wheel 0.37.0) package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) warning: no files found matching '*' under directory 'third_party' HOOK STDOUT: running egg_info HOOK STDOUT: creating pythran.egg-info HOOK STDOUT: writing pythran.egg-info/PKG-INFO HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt HOOK STDOUT: writing requirements to 
pythran.egg-info/requires.txt
HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt
HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt'
HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt'
HOOK STDOUT: reading manifest template 'MANIFEST.in'
HOOK STDOUT: adding license file 'LICENSE'
HOOK STDOUT: adding license file 'AUTHORS'
HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt'
Handling wheel from get_requires_for_build_wheel
Requirement satisfied: wheel (installed: wheel 0.37.0)
package init file 'pythran/pythonic/__init__.py' not found (or not a regular file)
warning: no files found matching '*' under directory 'third_party'
HOOK STDOUT: running dist_info
HOOK STDOUT: writing pythran.egg-info/PKG-INFO
HOOK STDOUT: writing dependency_links to pythran.egg-info/dependency_links.txt
HOOK STDOUT: writing entry points to pythran.egg-info/entry_points.txt
HOOK STDOUT: writing requirements to pythran.egg-info/requires.txt
HOOK STDOUT: writing top-level names to pythran.egg-info/top_level.txt
HOOK STDOUT: reading manifest file 'pythran.egg-info/SOURCES.txt'
HOOK STDOUT: reading manifest template 'MANIFEST.in'
HOOK STDOUT: adding license file 'LICENSE'
HOOK STDOUT: adding license file 'AUTHORS'
HOOK STDOUT: writing manifest file 'pythran.egg-info/SOURCES.txt'
HOOK STDOUT: creating '/builddir/build/BUILD/pythran-feature-0.11.0/pythran.dist-info'
HOOK STDOUT: adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
HOOK STDOUT: adding license file "AUTHORS" (matched pattern "AUTHORS*")
Handling ply (>=3.4) from wheel metadata: Requires-Dist
Requirement satisfied: ply (>=3.4) (installed: ply 3.11)
Handling gast (~=0.5.0) from wheel metadata: Requires-Dist
Requirement satisfied: gast (~=0.5.0) (installed: gast 0.5.3)
Handling numpy from wheel metadata: Requires-Dist
Requirement satisfied: numpy (installed: numpy 1.21.1)
Handling beniget (~=0.4.0) from wheel metadata: Requires-Dist
Requirement satisfied: beniget (~=0.4.0) (installed: beniget 0.4.1)
Handling numpy ; extra == 'doc' from wheel metadata: Requires-Dist
Requirement satisfied: numpy ; extra == 'doc' (installed: numpy 1.21.1)
Handling nbsphinx ; extra == 'doc' from wheel metadata: Requires-Dist
Requirement satisfied: nbsphinx ; extra == 'doc' (installed: nbsphinx 0.8.7)
Handling scipy ; extra == 'doc' from wheel metadata: Requires-Dist
Requirement satisfied: scipy ; extra == 'doc' (installed: scipy 1.7.0)
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.ftIXQj
+ umask 022
+ cd /builddir/build/BUILD
+ cd pythran-feature-0.11.0
+ mkdir -p /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir
+ /usr/bin/python3 -m pip wheel --wheel-dir /builddir/build/BUILD/pythran-feature-0.11.0/pyproject-wheeldir --no-deps --use-pep517 --no-build-isolation --disable-pip-version-check --no-clean --progress-bar off --verbose .
Processing /builddir/build/BUILD/pythran-feature-0.11.0 Preparing metadata (pyproject.toml): started Running command /usr/bin/python3 /usr/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_wheel /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/tmpalndvo_9 running dist_info creating /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info writing /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/PKG-INFO writing dependency_links to /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/dependency_links.txt writing entry points to /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/entry_points.txt writing requirements to /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/requires.txt writing top-level names to /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/top_level.txt writing manifest file '/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/SOURCES.txt' package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) reading manifest file '/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching '*' under directory 'third_party' adding license file 'LICENSE' adding license file 'AUTHORS' writing manifest file '/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.egg-info/SOURCES.txt' creating '/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-modern-metadata-_l55ohb9/pythran.dist-info' adding license file "LICENSE" (matched pattern "LICEN[CS]E*") adding license file "AUTHORS" (matched pattern "AUTHORS*") Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: pythran Building wheel for pythran (pyproject.toml): started Running command /usr/bin/python3 /usr/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/tmp2vqugja2 running bdist_wheel running build running build_py creating build creating build/lib creating build/lib/pythran copying pythran/__init__.py -> build/lib/pythran copying pythran/backend.py -> build/lib/pythran copying pythran/config.py -> build/lib/pythran copying pythran/conversion.py -> build/lib/pythran copying pythran/cxxgen.py -> build/lib/pythran copying pythran/cxxtypes.py -> build/lib/pythran copying pythran/dist.py -> build/lib/pythran copying pythran/errors.py -> build/lib/pythran copying pythran/frontend.py -> build/lib/pythran copying pythran/graph.py -> build/lib/pythran copying pythran/interval.py -> build/lib/pythran copying pythran/intrinsic.py -> build/lib/pythran copying pythran/log.py -> build/lib/pythran copying pythran/magic.py -> build/lib/pythran copying pythran/metadata.py -> build/lib/pythran copying pythran/middlend.py -> build/lib/pythran copying pythran/openmp.py -> build/lib/pythran copying pythran/passmanager.py -> build/lib/pythran copying pythran/run.py -> build/lib/pythran copying pythran/spec.py -> 
build/lib/pythran copying pythran/syntax.py -> build/lib/pythran copying pythran/tables.py -> build/lib/pythran copying pythran/toolchain.py -> build/lib/pythran copying pythran/typing.py -> build/lib/pythran copying pythran/unparse.py -> build/lib/pythran copying pythran/utils.py -> build/lib/pythran copying pythran/version.py -> build/lib/pythran creating build/lib/pythran/analyses copying pythran/analyses/__init__.py -> build/lib/pythran/analyses copying pythran/analyses/aliases.py -> build/lib/pythran/analyses copying pythran/analyses/ancestors.py -> build/lib/pythran/analyses copying pythran/analyses/argument_effects.py -> build/lib/pythran/analyses copying pythran/analyses/argument_read_once.py -> build/lib/pythran/analyses copying pythran/analyses/ast_matcher.py -> build/lib/pythran/analyses copying pythran/analyses/cfg.py -> build/lib/pythran/analyses copying pythran/analyses/constant_expressions.py -> build/lib/pythran/analyses copying pythran/analyses/dependencies.py -> build/lib/pythran/analyses copying pythran/analyses/extended_syntax_check.py -> build/lib/pythran/analyses copying pythran/analyses/fixed_size_list.py -> build/lib/pythran/analyses copying pythran/analyses/global_declarations.py -> build/lib/pythran/analyses copying pythran/analyses/global_effects.py -> build/lib/pythran/analyses copying pythran/analyses/globals_analysis.py -> build/lib/pythran/analyses copying pythran/analyses/has_return.py -> build/lib/pythran/analyses copying pythran/analyses/identifiers.py -> build/lib/pythran/analyses copying pythran/analyses/immediates.py -> build/lib/pythran/analyses copying pythran/analyses/imported_ids.py -> build/lib/pythran/analyses copying pythran/analyses/inlinable.py -> build/lib/pythran/analyses copying pythran/analyses/is_assigned.py -> build/lib/pythran/analyses copying pythran/analyses/lazyness_analysis.py -> build/lib/pythran/analyses copying pythran/analyses/literals.py -> build/lib/pythran/analyses copying pythran/analyses/local_declarations.py -> build/lib/pythran/analyses copying pythran/analyses/locals_analysis.py -> build/lib/pythran/analyses copying pythran/analyses/node_count.py -> build/lib/pythran/analyses copying pythran/analyses/optimizable_comprehension.py -> build/lib/pythran/analyses copying pythran/analyses/ordered_global_declarations.py -> build/lib/pythran/analyses copying pythran/analyses/parallel_maps.py -> build/lib/pythran/analyses copying pythran/analyses/potential_iterator.py -> build/lib/pythran/analyses copying pythran/analyses/pure_expressions.py -> build/lib/pythran/analyses copying pythran/analyses/range_values.py -> build/lib/pythran/analyses copying pythran/analyses/scope.py -> build/lib/pythran/analyses copying pythran/analyses/static_expressions.py -> build/lib/pythran/analyses copying pythran/analyses/use_def_chain.py -> build/lib/pythran/analyses copying pythran/analyses/use_omp.py -> build/lib/pythran/analyses copying pythran/analyses/yield_points.py -> build/lib/pythran/analyses creating build/lib/pythran/transformations copying pythran/transformations/__init__.py -> build/lib/pythran/transformations copying pythran/transformations/expand_builtins.py -> build/lib/pythran/transformations copying pythran/transformations/expand_globals.py -> build/lib/pythran/transformations copying pythran/transformations/expand_import_all.py -> build/lib/pythran/transformations copying pythran/transformations/expand_imports.py -> build/lib/pythran/transformations copying pythran/transformations/extract_doc_strings.py -> 
build/lib/pythran/transformations copying pythran/transformations/false_polymorphism.py -> build/lib/pythran/transformations copying pythran/transformations/handle_import.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_compare.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_exception.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_ifelse.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_is_none.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_method_calls.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_return.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_static_if.py -> build/lib/pythran/transformations copying pythran/transformations/normalize_tuples.py -> build/lib/pythran/transformations copying pythran/transformations/remove_comprehension.py -> build/lib/pythran/transformations copying pythran/transformations/remove_fstrings.py -> build/lib/pythran/transformations copying pythran/transformations/remove_lambdas.py -> build/lib/pythran/transformations copying pythran/transformations/remove_named_arguments.py -> build/lib/pythran/transformations copying pythran/transformations/remove_nested_functions.py -> build/lib/pythran/transformations copying pythran/transformations/unshadow_parameters.py -> build/lib/pythran/transformations creating build/lib/pythran/optimizations copying pythran/optimizations/__init__.py -> build/lib/pythran/optimizations copying pythran/optimizations/comprehension_patterns.py -> build/lib/pythran/optimizations copying pythran/optimizations/constant_folding.py -> build/lib/pythran/optimizations copying pythran/optimizations/dead_code_elimination.py -> build/lib/pythran/optimizations copying pythran/optimizations/forward_substitution.py -> build/lib/pythran/optimizations copying pythran/optimizations/inline_builtins.py -> build/lib/pythran/optimizations copying pythran/optimizations/inlining.py -> build/lib/pythran/optimizations copying pythran/optimizations/iter_transformation.py -> build/lib/pythran/optimizations copying pythran/optimizations/list_comp_to_genexp.py -> build/lib/pythran/optimizations copying pythran/optimizations/list_to_tuple.py -> build/lib/pythran/optimizations copying pythran/optimizations/loop_full_unrolling.py -> build/lib/pythran/optimizations copying pythran/optimizations/modindex.py -> build/lib/pythran/optimizations copying pythran/optimizations/pattern_transform.py -> build/lib/pythran/optimizations copying pythran/optimizations/range_based_simplify.py -> build/lib/pythran/optimizations copying pythran/optimizations/range_loop_unfolding.py -> build/lib/pythran/optimizations copying pythran/optimizations/remove_dead_functions.py -> build/lib/pythran/optimizations copying pythran/optimizations/simplify_except.py -> build/lib/pythran/optimizations copying pythran/optimizations/square.py -> build/lib/pythran/optimizations copying pythran/optimizations/tuple_to_shape.py -> build/lib/pythran/optimizations creating build/lib/omp copying omp/__init__.py -> build/lib/omp package init file 'pythran/pythonic/__init__.py' not found (or not a regular file) creating build/lib/pythran/types copying pythran/types/__init__.py -> build/lib/pythran/types copying pythran/types/conversion.py -> build/lib/pythran/types copying pythran/types/reorder.py -> build/lib/pythran/types copying pythran/types/signature.py -> 
build/lib/pythran/types copying pythran/types/tog.py -> build/lib/pythran/types copying pythran/types/type_dependencies.py -> build/lib/pythran/types copying pythran/types/types.py -> build/lib/pythran/types copying pythran/pythran-darwin.cfg -> build/lib/pythran copying pythran/pythran-default.cfg -> build/lib/pythran copying pythran/pythran-linux.cfg -> build/lib/pythran copying pythran/pythran-linux2.cfg -> build/lib/pythran copying pythran/pythran-win32.cfg -> build/lib/pythran copying pythran/pythran.cfg -> build/lib/pythran creating build/lib/pythran/pythonic copying pythran/pythonic/core.hpp -> build/lib/pythran/pythonic creating build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/clear.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/conjugate.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/copy.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/count.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/index.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/pop.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/remove.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/sort.hpp -> build/lib/pythran/pythonic/__dispatch__ copying pythran/pythonic/__dispatch__/update.hpp -> build/lib/pythran/pythonic/__dispatch__ creating build/lib/pythran/pythonic/bisect copying pythran/pythonic/bisect/bisect.hpp -> build/lib/pythran/pythonic/bisect copying pythran/pythonic/bisect/bisect_left.hpp -> build/lib/pythran/pythonic/bisect copying pythran/pythonic/bisect/bisect_right.hpp -> build/lib/pythran/pythonic/bisect creating build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ArithmeticError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/AssertionError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/AttributeError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/BaseException.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/BufferError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/BytesWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/DeprecationWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/EOFError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/EnvironmentError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/Exception.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/False.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/FileNotFoundError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/FloatingPointError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/FutureWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/GeneratorExit.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/IOError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ImportError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ImportWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/IndentationError.hpp -> 
build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/IndexError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/KeyError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/KeyboardInterrupt.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/LookupError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/MemoryError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/NameError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/None.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/NotImplementedError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/OSError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/OverflowError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/PendingDeprecationWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ReferenceError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/RuntimeError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/RuntimeWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/StopIteration.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/SyntaxError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/SyntaxWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/SystemError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/SystemExit.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/TabError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/True.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/TypeError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/UnboundLocalError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/UnicodeError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/UnicodeWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/UserWarning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ValueError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/Warning.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ZeroDivisionError.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/abs.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/all.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/any.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/assert.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/bin.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/bool_.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/chr.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/complex.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/dict.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/divmod.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/enumerate.hpp -> build/lib/pythran/pythonic/builtins 
copying pythran/pythonic/builtins/file.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/filter.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/float_.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/getattr.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/hex.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/id.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/in.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/int_.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/isinstance.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/iter.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/len.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/list.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/map.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/max.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/min.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/minmax.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/next.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/oct.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/open.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/ord.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/pow.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/print.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/range.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/reduce.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/reversed.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/round.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/set.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/slice.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/sorted.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/str.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/sum.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/tuple.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/type.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/xrange.hpp -> build/lib/pythran/pythonic/builtins copying pythran/pythonic/builtins/zip.hpp -> build/lib/pythran/pythonic/builtins creating build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/acos.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/acosh.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/asin.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/asinh.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/atan.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/atanh.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/cos.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/cosh.hpp -> build/lib/pythran/pythonic/cmath copying 
pythran/pythonic/cmath/e.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/exp.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/isinf.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/isnan.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/log.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/log10.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/pi.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/sin.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/sinh.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/sqrt.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/tan.hpp -> build/lib/pythran/pythonic/cmath copying pythran/pythonic/cmath/tanh.hpp -> build/lib/pythran/pythonic/cmath creating build/lib/pythran/pythonic/functools copying pythran/pythonic/functools/partial.hpp -> build/lib/pythran/pythonic/functools copying pythran/pythonic/functools/reduce.hpp -> build/lib/pythran/pythonic/functools creating build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/combinations.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/common.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/count.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/ifilter.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/islice.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/permutations.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/product.hpp -> build/lib/pythran/pythonic/itertools copying pythran/pythonic/itertools/repeat.hpp -> build/lib/pythran/pythonic/itertools creating build/lib/pythran/pythonic/math copying pythran/pythonic/math/acos.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/acosh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/asin.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/asinh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/atan.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/atan2.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/atanh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/ceil.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/copysign.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/cos.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/cosh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/degrees.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/e.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/erf.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/erfc.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/exp.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/expm1.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/fabs.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/factorial.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/floor.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/fmod.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/frexp.hpp -> build/lib/pythran/pythonic/math copying 
pythran/pythonic/math/gamma.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/hypot.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/isinf.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/isnan.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/ldexp.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/lgamma.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/log.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/log10.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/log1p.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/modf.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/pi.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/pow.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/radians.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/sin.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/sinh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/sqrt.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/tan.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/tanh.hpp -> build/lib/pythran/pythonic/math copying pythran/pythonic/math/trunc.hpp -> build/lib/pythran/pythonic/math creating build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/NINF.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/abs.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/absolute.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/add.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/alen.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/all.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/allclose.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/alltrue.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/amax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/amin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/angle.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/angle_in_deg.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/angle_in_rad.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/any.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/append.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arange.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arccos.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arccosh.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arcsin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arcsinh.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arctan.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arctan2.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/arctanh.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/argmax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/argmin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/argminmax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/argsort.hpp -> 
build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/argwhere.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/around.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array2string.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array_equal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array_equiv.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array_split.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/array_str.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/asarray.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/asarray_chkfinite.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ascontiguousarray.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/asfarray.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/asscalar.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/atleast_1d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/atleast_2d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/atleast_3d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/average.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/base_repr.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/binary_repr.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bincount.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bitwise_and.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bitwise_not.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bitwise_or.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bitwise_xor.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/bool_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/broadcast_to.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/byte.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cbrt.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ceil.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/clip.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/complex.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/complex128.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/complex256.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/complex64.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/concatenate.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/conj.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/conjugate.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/convolve.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/copy.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/copysign.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/copyto.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/correlate.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cos.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cosh.hpp -> 
build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/count_nonzero.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cross.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cumprod.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cumproduct.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/cumsum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/deg2rad.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/degrees.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/delete_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/diag.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/diagflat.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/diagonal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/diff.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/digitize.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/divide.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/dot.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/double_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/e.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ediff1d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/empty.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/empty_like.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/equal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/exp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/expand_dims.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/expm1.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/eye.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fabs.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fill_diagonal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/finfo.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fix.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/flatnonzero.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/flip.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fliplr.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/flipud.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/float128.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/float32.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/float64.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/float_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/floor.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/floor_divide.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fmax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fmin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fmod.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/frexp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fromfile.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fromfunction.hpp -> 
build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fromiter.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/fromstring.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/full.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/full_like.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/greater.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/greater_equal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/heaviside.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/hstack.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/hypot.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/identity.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/imag.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/indices.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/inf.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/inner.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/insert.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/int16.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/int32.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/int64.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/int8.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/int_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/intc.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/interp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/interp_core.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/intersect1d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/intp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/invert.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isclose.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/iscomplex.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isfinite.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isinf.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isnan.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isneginf.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isposinf.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isreal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isrealobj.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/isscalar.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/issctype.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ldexp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/left_shift.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/less.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/less_equal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/lexsort.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/linspace.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/log.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/log10.hpp -> 
build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/log1p.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/log2.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logaddexp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logaddexp2.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logical_and.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logical_not.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logical_or.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logical_xor.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/logspace.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/longlong.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/max.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/maximum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/mean.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/median.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/min.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/minimum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/mod.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/multiply.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nan.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nan_to_num.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nanargmax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nanargmin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nanmax.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nanmin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nansum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ndarray.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ndenumerate.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ndim.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ndindex.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/negative.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/newaxis.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nextafter.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/nonzero.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/not_equal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ones.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ones_like.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/outer.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/partial_sum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/pi.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/place.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/power.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/prod.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/product.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ptp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/put.hpp -> 
build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/putmask.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/rad2deg.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/radians.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ravel.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/real.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/reciprocal.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/reduce.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/remainder.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/repeat.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/resize.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/right_shift.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/rint.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/roll.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/rollaxis.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/rot90.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/round.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/round_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/searchsorted.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/select.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/setdiff1d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/shape.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/short_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sign.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/signbit.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sin.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sinh.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/size.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sometrue.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sort.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sort_complex.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/spacing.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/split.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sqrt.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/square.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/stack.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/std_.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/subtract.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/sum.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/swapaxes.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/take.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/tan.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/tanh.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/tile.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/trace.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/transpose.hpp -> build/lib/pythran/pythonic/numpy copying 
pythran/pythonic/numpy/tri.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/tril.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/trim_zeros.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/triu.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/true_divide.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/trunc.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ubyte.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ufunc_accumulate.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ufunc_reduce.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uint.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uint16.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uint32.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uint64.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uint8.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uintc.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/uintp.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ulonglong.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/union1d.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/unique.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/unravel_index.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/unwrap.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/ushort.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/var.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/vdot.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/vstack.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/where.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/zeros.hpp -> build/lib/pythran/pythonic/numpy copying pythran/pythonic/numpy/zeros_like.hpp -> build/lib/pythran/pythonic/numpy creating build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/get_num_threads.hpp -> build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/get_thread_num.hpp -> build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/get_wtick.hpp -> build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/get_wtime.hpp -> build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/in_parallel.hpp -> build/lib/pythran/pythonic/omp copying pythran/pythonic/omp/set_nested.hpp -> build/lib/pythran/pythonic/omp creating build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__abs__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__add__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__and__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__concat__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__contains__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__delitem__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__div__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__eq__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__floordiv__.hpp -> 
build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ge__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__getitem__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__gt__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__iadd__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__iand__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__iconcat__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__idiv__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ifloordiv__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ilshift__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__imod__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__imul__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__inv__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__invert__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ior__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ipow__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__irshift__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__isub__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__itruediv__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ixor__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__le__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__lshift__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__lt__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__matmul__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__mod__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__mul__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__ne__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__neg__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__not__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__or__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__pos__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__rshift__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__sub__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__truediv__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/__xor__.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/abs.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/add.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/and_.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/concat.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/contains.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/countOf.hpp -> build/lib/pythran/pythonic/operator_ 
copying pythran/pythonic/operator_/delitem.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/div.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/eq.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/floordiv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ge.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/getitem.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/gt.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/iadd.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/iand.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/icommon.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/iconcat.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/idiv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ifloordiv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ilshift.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/imatmul.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/imax.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/imin.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/imod.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/imul.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/indexOf.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/inv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/invert.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ior.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ipow.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/irshift.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/is_.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/is_not.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/isub.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/itemgetter.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/itruediv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ixor.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/le.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/lshift.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/lt.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/matmul.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/mod.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/mul.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/ne.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/neg.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/not_.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/or_.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/overloads.hpp -> 
build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/pos.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/pow.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/rshift.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/sub.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/truediv.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/truth.hpp -> build/lib/pythran/pythonic/operator_ copying pythran/pythonic/operator_/xor_.hpp -> build/lib/pythran/pythonic/operator_ creating build/lib/pythran/pythonic/python copying pythran/pythonic/python/core.hpp -> build/lib/pythran/pythonic/python copying pythran/pythonic/python/exception_handler.hpp -> build/lib/pythran/pythonic/python creating build/lib/pythran/pythonic/random copying pythran/pythonic/random/choice.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/expovariate.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/gauss.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/randint.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/random.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/randrange.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/sample.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/seed.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/shuffle.hpp -> build/lib/pythran/pythonic/random copying pythran/pythonic/random/uniform.hpp -> build/lib/pythran/pythonic/random creating build/lib/pythran/pythonic/string copying pythran/pythonic/string/ascii_letters.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/ascii_lowercase.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/ascii_uppercase.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/digits.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/find.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/hexdigits.hpp -> build/lib/pythran/pythonic/string copying pythran/pythonic/string/octdigits.hpp -> build/lib/pythran/pythonic/string creating build/lib/pythran/pythonic/time copying pythran/pythonic/time/sleep.hpp -> build/lib/pythran/pythonic/time copying pythran/pythonic/time/time.hpp -> build/lib/pythran/pythonic/time creating build/lib/pythran/pythonic/types copying pythran/pythonic/types/NoneType.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/assignable.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/attr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/bool.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/cfun.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/combined.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/complex.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/complex128.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/complex256.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/complex64.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/dict.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/dynamic_tuple.hpp -> build/lib/pythran/pythonic/types copying 
pythran/pythonic/types/empty_iterator.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/exceptions.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/file.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/finfo.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/float.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/float128.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/float32.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/float64.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/generator.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/int.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/int16.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/int32.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/int64.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/int8.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/intc.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/intp.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/list.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/ndarray.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/nditerator.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_binary_op.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_broadcast.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_expr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_gexpr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_iexpr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_nary_expr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_op_helper.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_operators.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_texpr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_unary_op.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/numpy_vexpr.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/pointer.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/raw_array.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/set.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/slice.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/static_if.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/str.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/traits.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/tuple.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uint16.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uint32.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uint64.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uint8.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uintc.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/uintp.hpp -> build/lib/pythran/pythonic/types copying pythran/pythonic/types/variant_functor.hpp 
-> build/lib/pythran/pythonic/types copying pythran/pythonic/types/vectorizable_type.hpp -> build/lib/pythran/pythonic/types creating build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/array_helper.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/broadcast_copy.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/functor.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/fwd.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/int_.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/iterator.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/meta.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/nested_container.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/neutral.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/numpy_conversion.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/numpy_traits.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/pdqsort.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/reserve.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/seq.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/shared_ref.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/tags.hpp -> build/lib/pythran/pythonic/utils copying pythran/pythonic/utils/yield.hpp -> build/lib/pythran/pythonic/utils creating build/lib/pythran/pythonic/builtins/complex copying pythran/pythonic/builtins/complex/conjugate.hpp -> build/lib/pythran/pythonic/builtins/complex creating build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/clear.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/copy.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/fromkeys.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/get.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/items.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/keys.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/pop.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/popitem.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/setdefault.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/update.hpp -> build/lib/pythran/pythonic/builtins/dict copying pythran/pythonic/builtins/dict/values.hpp -> build/lib/pythran/pythonic/builtins/dict creating build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/close.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/fileno.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/flush.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/isatty.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/next.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/read.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/readline.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/readlines.hpp -> 
build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/seek.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/tell.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/truncate.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/write.hpp -> build/lib/pythran/pythonic/builtins/file copying pythran/pythonic/builtins/file/writelines.hpp -> build/lib/pythran/pythonic/builtins/file creating build/lib/pythran/pythonic/builtins/float_ copying pythran/pythonic/builtins/float_/is_integer.hpp -> build/lib/pythran/pythonic/builtins/float_ creating build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/append.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/count.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/extend.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/insert.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/pop.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/remove.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/reverse.hpp -> build/lib/pythran/pythonic/builtins/list copying pythran/pythonic/builtins/list/sort.hpp -> build/lib/pythran/pythonic/builtins/list creating build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/StaticIfBreak.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/StaticIfCont.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/StaticIfNoReturn.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/StaticIfReturn.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/abssqr.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/and_.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/is_none.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/kwonly.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/len_set.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/make_shape.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/or_.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/static_if.hpp -> build/lib/pythran/pythonic/builtins/pythran copying pythran/pythonic/builtins/pythran/static_list.hpp -> build/lib/pythran/pythonic/builtins/pythran creating build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/add.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/clear.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/copy.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/difference.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/difference_update.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/discard.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/intersection.hpp -> 
build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/intersection_update.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/isdisjoint.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/issubset.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/issuperset.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/remove.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/symmetric_difference.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/symmetric_difference_update.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/union_.hpp -> build/lib/pythran/pythonic/builtins/set copying pythran/pythonic/builtins/set/update.hpp -> build/lib/pythran/pythonic/builtins/set creating build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/__mod__.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/capitalize.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/count.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/endswith.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/find.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/isalpha.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/isdigit.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/join.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/lower.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/lstrip.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/replace.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/rstrip.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/split.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/startswith.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/strip.hpp -> build/lib/pythran/pythonic/builtins/str copying pythran/pythonic/builtins/str/upper.hpp -> build/lib/pythran/pythonic/builtins/str creating build/lib/pythran/pythonic/include creating build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/clear.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/conjugate.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/copy.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/count.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/index.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/pop.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/remove.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/sort.hpp -> build/lib/pythran/pythonic/include/__dispatch__ copying pythran/pythonic/include/__dispatch__/update.hpp -> build/lib/pythran/pythonic/include/__dispatch__ creating 
build/lib/pythran/pythonic/include/bisect copying pythran/pythonic/include/bisect/bisect.hpp -> build/lib/pythran/pythonic/include/bisect copying pythran/pythonic/include/bisect/bisect_left.hpp -> build/lib/pythran/pythonic/include/bisect copying pythran/pythonic/include/bisect/bisect_right.hpp -> build/lib/pythran/pythonic/include/bisect creating build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ArithmeticError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/AssertionError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/AttributeError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/BaseException.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/BufferError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/BytesWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/DeprecationWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/EOFError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/EnvironmentError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/Exception.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/False.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/FileNotFoundError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/FloatingPointError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/FutureWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/GeneratorExit.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/IOError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ImportError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ImportWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/IndentationError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/IndexError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/KeyError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/KeyboardInterrupt.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/LookupError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/MemoryError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/NameError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/None.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/NotImplementedError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/OSError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/OverflowError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/PendingDeprecationWarning.hpp -> 
build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ReferenceError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/RuntimeError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/RuntimeWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/StopIteration.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/SyntaxError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/SyntaxWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/SystemError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/SystemExit.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/TabError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/True.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/TypeError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/UnboundLocalError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/UnicodeError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/UnicodeWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/UserWarning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ValueError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/Warning.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ZeroDivisionError.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/abs.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/all.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/any.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/assert.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/bin.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/bool_.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/chr.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/complex.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/dict.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/divmod.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/enumerate.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/file.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/filter.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/float_.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/getattr.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/hex.hpp -> build/lib/pythran/pythonic/include/builtins copying 
pythran/pythonic/include/builtins/id.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/in.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/int_.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/isinstance.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/iter.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/len.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/list.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/map.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/max.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/min.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/minmax.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/next.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/oct.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/open.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/ord.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/pow.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/print.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/range.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/reduce.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/reversed.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/round.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/set.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/slice.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/sorted.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/str.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/sum.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/tuple.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/type.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/xrange.hpp -> build/lib/pythran/pythonic/include/builtins copying pythran/pythonic/include/builtins/zip.hpp -> build/lib/pythran/pythonic/include/builtins creating build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/acos.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/acosh.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/asin.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/asinh.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/atan.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/atanh.hpp -> build/lib/pythran/pythonic/include/cmath copying 
pythran/pythonic/include/cmath/cos.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/cosh.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/e.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/exp.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/isinf.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/isnan.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/log.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/log10.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/pi.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/sin.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/sinh.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/sqrt.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/tan.hpp -> build/lib/pythran/pythonic/include/cmath copying pythran/pythonic/include/cmath/tanh.hpp -> build/lib/pythran/pythonic/include/cmath creating build/lib/pythran/pythonic/include/functools copying pythran/pythonic/include/functools/partial.hpp -> build/lib/pythran/pythonic/include/functools copying pythran/pythonic/include/functools/reduce.hpp -> build/lib/pythran/pythonic/include/functools creating build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/combinations.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/common.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/count.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/ifilter.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/islice.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/permutations.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/product.hpp -> build/lib/pythran/pythonic/include/itertools copying pythran/pythonic/include/itertools/repeat.hpp -> build/lib/pythran/pythonic/include/itertools creating build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/acos.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/acosh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/asin.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/asinh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/atan.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/atan2.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/atanh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/ceil.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/copysign.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/cos.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/cosh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/degrees.hpp -> build/lib/pythran/pythonic/include/math 
copying pythran/pythonic/include/math/e.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/erf.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/erfc.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/exp.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/expm1.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/fabs.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/factorial.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/floor.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/fmod.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/frexp.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/gamma.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/hypot.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/isinf.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/isnan.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/ldexp.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/lgamma.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/log.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/log10.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/log1p.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/modf.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/pi.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/pow.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/radians.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/sin.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/sinh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/sqrt.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/tan.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/tanh.hpp -> build/lib/pythran/pythonic/include/math copying pythran/pythonic/include/math/trunc.hpp -> build/lib/pythran/pythonic/include/math creating build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/NINF.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/abs.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/absolute.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/add.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/alen.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/all.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/allclose.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/alltrue.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/amax.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/amin.hpp 
-> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/angle.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/angle_in_deg.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/angle_in_rad.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/any.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/append.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arange.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arccos.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arccosh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arcsin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arcsinh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arctan.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arctan2.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/arctanh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/argmax.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/argmin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/argsort.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/argwhere.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/around.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array2string.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array_equal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array_equiv.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array_split.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/array_str.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/asarray.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/asarray_chkfinite.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ascontiguousarray.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/asfarray.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/asscalar.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/atleast_1d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/atleast_2d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/atleast_3d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/average.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/base_repr.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/binary_repr.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/bincount.hpp -> build/lib/pythran/pythonic/include/numpy copying 
pythran/pythonic/include/numpy/bitwise_and.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/bitwise_not.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/bitwise_or.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/bitwise_xor.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/bool_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/broadcast_to.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/byte.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cbrt.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ceil.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/clip.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/complex.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/complex128.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/complex256.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/complex64.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/concatenate.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/conj.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/conjugate.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/convolve.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/copy.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/copysign.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/copyto.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/correlate.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cos.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cosh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/count_nonzero.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cross.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cumprod.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cumproduct.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/cumsum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/deg2rad.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/degrees.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/delete_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/diag.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/diagflat.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/diagonal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/diff.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/digitize.hpp -> build/lib/pythran/pythonic/include/numpy copying 
pythran/pythonic/include/numpy/divide.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/dot.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/double_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/e.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ediff1d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/empty.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/empty_like.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/equal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/exp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/expand_dims.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/expm1.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/eye.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fabs.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fill_diagonal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/finfo.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fix.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/flatnonzero.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/flip.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fliplr.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/flipud.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/float128.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/float32.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/float64.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/float_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/floor.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/floor_divide.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fmax.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fmin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fmod.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/frexp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fromfile.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fromfunction.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fromiter.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/fromstring.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/full.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/full_like.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/greater.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/greater_equal.hpp -> 
build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/heaviside.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/hstack.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/hypot.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/identity.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/imag.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/indices.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/inf.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/inner.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/insert.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/int16.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/int32.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/int64.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/int8.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/int_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/intc.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/interp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/intersect1d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/intp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/invert.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isclose.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/iscomplex.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isfinite.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isinf.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isnan.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isneginf.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isposinf.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isreal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isrealobj.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/isscalar.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/issctype.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ldexp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/left_shift.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/less.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/less_equal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/lexsort.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/linspace.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/log.hpp -> build/lib/pythran/pythonic/include/numpy copying 
pythran/pythonic/include/numpy/log10.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/log1p.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/log2.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logaddexp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logaddexp2.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logical_and.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logical_not.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logical_or.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logical_xor.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/logspace.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/longlong.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/max.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/maximum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/mean.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/median.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/min.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/minimum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/mod.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/multiply.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nan.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nan_to_num.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nanargmax.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nanargmin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nanmax.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nanmin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nansum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ndarray.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ndenumerate.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ndim.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ndindex.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/negative.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/newaxis.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nextafter.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/nonzero.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/not_equal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ones.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ones_like.hpp -> build/lib/pythran/pythonic/include/numpy copying 
pythran/pythonic/include/numpy/outer.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/partial_sum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/pi.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/place.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/power.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/prod.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/product.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ptp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/put.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/putmask.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/rad2deg.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/radians.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ravel.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/real.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/reciprocal.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/reduce.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/remainder.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/repeat.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/resize.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/right_shift.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/rint.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/roll.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/rollaxis.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/rot90.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/round.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/round_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/searchsorted.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/select.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/setdiff1d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/shape.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/short_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sign.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/signbit.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sin.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sinh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/size.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sometrue.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sort.hpp -> 
build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sort_complex.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/spacing.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/split.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sqrt.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/square.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/stack.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/std_.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/subtract.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/sum.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/swapaxes.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/take.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/tan.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/tanh.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/tile.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/trace.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/transpose.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/tri.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/tril.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/trim_zeros.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/triu.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/true_divide.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/trunc.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ubyte.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ufunc_accumulate.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ufunc_reduce.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uint.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uint16.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uint32.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uint64.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uint8.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uintc.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/uintp.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/ulonglong.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/union1d.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/unique.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/unravel_index.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/unwrap.hpp -> build/lib/pythran/pythonic/include/numpy copying 
pythran/pythonic/include/numpy/ushort.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/var.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/vdot.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/vstack.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/where.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/zeros.hpp -> build/lib/pythran/pythonic/include/numpy copying pythran/pythonic/include/numpy/zeros_like.hpp -> build/lib/pythran/pythonic/include/numpy creating build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/get_num_threads.hpp -> build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/get_thread_num.hpp -> build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/get_wtick.hpp -> build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/get_wtime.hpp -> build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/in_parallel.hpp -> build/lib/pythran/pythonic/include/omp copying pythran/pythonic/include/omp/set_nested.hpp -> build/lib/pythran/pythonic/include/omp creating build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__abs__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__add__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__and__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__concat__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__contains__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__delitem__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__div__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__eq__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__floordiv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ge__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__getitem__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__gt__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__iadd__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__iand__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__iconcat__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__idiv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ifloordiv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ilshift__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__imod__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__imul__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__inv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying 
pythran/pythonic/include/operator_/__invert__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ior__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ipow__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__irshift__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__isub__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__itruediv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ixor__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__le__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__lshift__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__lt__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__matmul__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__mod__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__mul__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__ne__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__neg__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__not__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__or__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__pos__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__rshift__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__sub__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__truediv__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/__xor__.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/abs.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/add.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/and_.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/concat.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/contains.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/countOf.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/delitem.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/div.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/eq.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/floordiv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ge.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/getitem.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/gt.hpp -> 
build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/iadd.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/iand.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/icommon.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/iconcat.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/idiv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ifloordiv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ilshift.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/imatmul.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/imax.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/imin.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/imod.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/imul.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/indexOf.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/inv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/invert.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ior.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ipow.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/irshift.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/is_.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/is_not.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/isub.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/itemgetter.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/itruediv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ixor.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/le.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/lshift.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/lt.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/matmul.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/mod.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/mul.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/ne.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/neg.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/not_.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/or_.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/overloads.hpp -> 
build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/pos.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/pow.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/rshift.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/sub.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/truediv.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/truth.hpp -> build/lib/pythran/pythonic/include/operator_ copying pythran/pythonic/include/operator_/xor_.hpp -> build/lib/pythran/pythonic/include/operator_ creating build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/choice.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/expovariate.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/gauss.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/randint.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/random.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/randrange.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/sample.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/seed.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/shuffle.hpp -> build/lib/pythran/pythonic/include/random copying pythran/pythonic/include/random/uniform.hpp -> build/lib/pythran/pythonic/include/random creating build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/ascii_letters.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/ascii_lowercase.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/ascii_uppercase.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/digits.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/find.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/hexdigits.hpp -> build/lib/pythran/pythonic/include/string copying pythran/pythonic/include/string/octdigits.hpp -> build/lib/pythran/pythonic/include/string creating build/lib/pythran/pythonic/include/time copying pythran/pythonic/include/time/sleep.hpp -> build/lib/pythran/pythonic/include/time copying pythran/pythonic/include/time/time.hpp -> build/lib/pythran/pythonic/include/time creating build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/NoneType.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/assignable.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/attr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/bool.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/cfun.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/combined.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/complex.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/complex128.hpp -> 
build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/complex256.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/complex64.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/dict.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/dynamic_tuple.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/empty_iterator.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/exceptions.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/file.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/finfo.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/float.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/float128.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/float32.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/float64.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/generator.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/immediate.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/int.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/int16.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/int32.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/int64.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/int8.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/intc.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/intp.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/lazy.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/list.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/ndarray.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/nditerator.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_binary_op.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_broadcast.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_expr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_gexpr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_iexpr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_nary_expr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_op_helper.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_operators.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_texpr.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_unary_op.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/numpy_vexpr.hpp -> build/lib/pythran/pythonic/include/types copying 
pythran/pythonic/include/types/pointer.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/raw_array.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/set.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/slice.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/static_if.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/str.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/traits.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/tuple.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uint16.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uint32.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uint64.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uint8.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uintc.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/uintp.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/variant_functor.hpp -> build/lib/pythran/pythonic/include/types copying pythran/pythonic/include/types/vectorizable_type.hpp -> build/lib/pythran/pythonic/include/types creating build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/array_helper.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/broadcast_copy.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/functor.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/fwd.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/int_.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/iterator.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/meta.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/nested_container.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/neutral.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/numpy_conversion.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/numpy_traits.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/reserve.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/seq.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/shared_ref.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/tags.hpp -> build/lib/pythran/pythonic/include/utils copying pythran/pythonic/include/utils/yield.hpp -> build/lib/pythran/pythonic/include/utils creating build/lib/pythran/pythonic/numpy/add copying pythran/pythonic/numpy/add/accumulate.hpp -> build/lib/pythran/pythonic/numpy/add copying pythran/pythonic/numpy/add/reduce.hpp -> build/lib/pythran/pythonic/numpy/add creating build/lib/pythran/pythonic/numpy/arctan2 copying pythran/pythonic/numpy/arctan2/accumulate.hpp -> build/lib/pythran/pythonic/numpy/arctan2 creating build/lib/pythran/pythonic/numpy/bitwise_and copying 
pythran/pythonic/numpy/bitwise_and/accumulate.hpp -> build/lib/pythran/pythonic/numpy/bitwise_and copying pythran/pythonic/numpy/bitwise_and/reduce.hpp -> build/lib/pythran/pythonic/numpy/bitwise_and creating build/lib/pythran/pythonic/numpy/bitwise_or copying pythran/pythonic/numpy/bitwise_or/accumulate.hpp -> build/lib/pythran/pythonic/numpy/bitwise_or copying pythran/pythonic/numpy/bitwise_or/reduce.hpp -> build/lib/pythran/pythonic/numpy/bitwise_or creating build/lib/pythran/pythonic/numpy/bitwise_xor copying pythran/pythonic/numpy/bitwise_xor/accumulate.hpp -> build/lib/pythran/pythonic/numpy/bitwise_xor copying pythran/pythonic/numpy/bitwise_xor/reduce.hpp -> build/lib/pythran/pythonic/numpy/bitwise_xor creating build/lib/pythran/pythonic/numpy/copysign copying pythran/pythonic/numpy/copysign/accumulate.hpp -> build/lib/pythran/pythonic/numpy/copysign creating build/lib/pythran/pythonic/numpy/ctypeslib copying pythran/pythonic/numpy/ctypeslib/as_array.hpp -> build/lib/pythran/pythonic/numpy/ctypeslib creating build/lib/pythran/pythonic/numpy/divide copying pythran/pythonic/numpy/divide/accumulate.hpp -> build/lib/pythran/pythonic/numpy/divide creating build/lib/pythran/pythonic/numpy/dtype copying pythran/pythonic/numpy/dtype/type.hpp -> build/lib/pythran/pythonic/numpy/dtype creating build/lib/pythran/pythonic/numpy/equal copying pythran/pythonic/numpy/equal/accumulate.hpp -> build/lib/pythran/pythonic/numpy/equal creating build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/c2c.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/fft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/hfft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/ifft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/ihfft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/irfft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/pocketfft.hpp -> build/lib/pythran/pythonic/numpy/fft copying pythran/pythonic/numpy/fft/rfft.hpp -> build/lib/pythran/pythonic/numpy/fft creating build/lib/pythran/pythonic/numpy/floor_divide copying pythran/pythonic/numpy/floor_divide/accumulate.hpp -> build/lib/pythran/pythonic/numpy/floor_divide creating build/lib/pythran/pythonic/numpy/fmax copying pythran/pythonic/numpy/fmax/accumulate.hpp -> build/lib/pythran/pythonic/numpy/fmax copying pythran/pythonic/numpy/fmax/reduce.hpp -> build/lib/pythran/pythonic/numpy/fmax creating build/lib/pythran/pythonic/numpy/fmin copying pythran/pythonic/numpy/fmin/accumulate.hpp -> build/lib/pythran/pythonic/numpy/fmin copying pythran/pythonic/numpy/fmin/reduce.hpp -> build/lib/pythran/pythonic/numpy/fmin creating build/lib/pythran/pythonic/numpy/fmod copying pythran/pythonic/numpy/fmod/accumulate.hpp -> build/lib/pythran/pythonic/numpy/fmod creating build/lib/pythran/pythonic/numpy/greater copying pythran/pythonic/numpy/greater/accumulate.hpp -> build/lib/pythran/pythonic/numpy/greater creating build/lib/pythran/pythonic/numpy/greater_equal copying pythran/pythonic/numpy/greater_equal/accumulate.hpp -> build/lib/pythran/pythonic/numpy/greater_equal creating build/lib/pythran/pythonic/numpy/heaviside copying pythran/pythonic/numpy/heaviside/accumulate.hpp -> build/lib/pythran/pythonic/numpy/heaviside creating build/lib/pythran/pythonic/numpy/hypot copying pythran/pythonic/numpy/hypot/accumulate.hpp -> build/lib/pythran/pythonic/numpy/hypot creating 
build/lib/pythran/pythonic/numpy/ldexp copying pythran/pythonic/numpy/ldexp/accumulate.hpp -> build/lib/pythran/pythonic/numpy/ldexp creating build/lib/pythran/pythonic/numpy/left_shift copying pythran/pythonic/numpy/left_shift/accumulate.hpp -> build/lib/pythran/pythonic/numpy/left_shift creating build/lib/pythran/pythonic/numpy/less copying pythran/pythonic/numpy/less/accumulate.hpp -> build/lib/pythran/pythonic/numpy/less creating build/lib/pythran/pythonic/numpy/less_equal copying pythran/pythonic/numpy/less_equal/accumulate.hpp -> build/lib/pythran/pythonic/numpy/less_equal creating build/lib/pythran/pythonic/numpy/linalg copying pythran/pythonic/numpy/linalg/matrix_power.hpp -> build/lib/pythran/pythonic/numpy/linalg copying pythran/pythonic/numpy/linalg/norm.hpp -> build/lib/pythran/pythonic/numpy/linalg creating build/lib/pythran/pythonic/numpy/logaddexp copying pythran/pythonic/numpy/logaddexp/accumulate.hpp -> build/lib/pythran/pythonic/numpy/logaddexp creating build/lib/pythran/pythonic/numpy/logaddexp2 copying pythran/pythonic/numpy/logaddexp2/accumulate.hpp -> build/lib/pythran/pythonic/numpy/logaddexp2 creating build/lib/pythran/pythonic/numpy/logical_and copying pythran/pythonic/numpy/logical_and/accumulate.hpp -> build/lib/pythran/pythonic/numpy/logical_and creating build/lib/pythran/pythonic/numpy/logical_or copying pythran/pythonic/numpy/logical_or/accumulate.hpp -> build/lib/pythran/pythonic/numpy/logical_or creating build/lib/pythran/pythonic/numpy/logical_xor copying pythran/pythonic/numpy/logical_xor/accumulate.hpp -> build/lib/pythran/pythonic/numpy/logical_xor creating build/lib/pythran/pythonic/numpy/maximum copying pythran/pythonic/numpy/maximum/accumulate.hpp -> build/lib/pythran/pythonic/numpy/maximum copying pythran/pythonic/numpy/maximum/reduce.hpp -> build/lib/pythran/pythonic/numpy/maximum creating build/lib/pythran/pythonic/numpy/minimum copying pythran/pythonic/numpy/minimum/accumulate.hpp -> build/lib/pythran/pythonic/numpy/minimum copying pythran/pythonic/numpy/minimum/reduce.hpp -> build/lib/pythran/pythonic/numpy/minimum creating build/lib/pythran/pythonic/numpy/mod copying pythran/pythonic/numpy/mod/accumulate.hpp -> build/lib/pythran/pythonic/numpy/mod creating build/lib/pythran/pythonic/numpy/multiply copying pythran/pythonic/numpy/multiply/accumulate.hpp -> build/lib/pythran/pythonic/numpy/multiply copying pythran/pythonic/numpy/multiply/reduce.hpp -> build/lib/pythran/pythonic/numpy/multiply creating build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/astype.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/fill.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/flatten.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/item.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/reshape.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/sort.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/tofile.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/tolist.hpp -> build/lib/pythran/pythonic/numpy/ndarray copying pythran/pythonic/numpy/ndarray/tostring.hpp -> build/lib/pythran/pythonic/numpy/ndarray creating build/lib/pythran/pythonic/numpy/negative copying pythran/pythonic/numpy/negative/accumulate.hpp -> build/lib/pythran/pythonic/numpy/negative creating 
build/lib/pythran/pythonic/numpy/nextafter copying pythran/pythonic/numpy/nextafter/accumulate.hpp -> build/lib/pythran/pythonic/numpy/nextafter creating build/lib/pythran/pythonic/numpy/not_equal copying pythran/pythonic/numpy/not_equal/accumulate.hpp -> build/lib/pythran/pythonic/numpy/not_equal creating build/lib/pythran/pythonic/numpy/power copying pythran/pythonic/numpy/power/accumulate.hpp -> build/lib/pythran/pythonic/numpy/power creating build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/binomial.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/bytes.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/chisquare.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/choice.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/dirichlet.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/exponential.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/f.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/gamma.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/geometric.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/gumbel.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/laplace.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/logistic.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/lognormal.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/logseries.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/negative_binomial.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/normal.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/pareto.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/poisson.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/power.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/rand.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/randint.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/randn.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/random.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/random_integers.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/random_sample.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/ranf.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/rayleigh.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/sample.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/seed.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/shuffle.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/standard_exponential.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/standard_gamma.hpp -> build/lib/pythran/pythonic/numpy/random copying 
pythran/pythonic/numpy/random/standard_normal.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/uniform.hpp -> build/lib/pythran/pythonic/numpy/random copying pythran/pythonic/numpy/random/weibull.hpp -> build/lib/pythran/pythonic/numpy/random creating build/lib/pythran/pythonic/numpy/remainder copying pythran/pythonic/numpy/remainder/accumulate.hpp -> build/lib/pythran/pythonic/numpy/remainder creating build/lib/pythran/pythonic/numpy/right_shift copying pythran/pythonic/numpy/right_shift/accumulate.hpp -> build/lib/pythran/pythonic/numpy/right_shift creating build/lib/pythran/pythonic/numpy/subtract copying pythran/pythonic/numpy/subtract/accumulate.hpp -> build/lib/pythran/pythonic/numpy/subtract creating build/lib/pythran/pythonic/numpy/true_divide copying pythran/pythonic/numpy/true_divide/accumulate.hpp -> build/lib/pythran/pythonic/numpy/true_divide creating build/lib/pythran/pythonic/os creating build/lib/pythran/pythonic/os/path copying pythran/pythonic/os/path/join.hpp -> build/lib/pythran/pythonic/os/path creating build/lib/pythran/pythonic/scipy creating build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/binom.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/chbevl.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/gamma.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/gammaln.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/hankel1.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/hankel2.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/i0.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/i0e.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/iv.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/ivp.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/jv.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/jvp.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/kv.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/kvp.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/spherical_jn.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/spherical_yn.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/yv.hpp -> build/lib/pythran/pythonic/scipy/special copying pythran/pythonic/scipy/special/yvp.hpp -> build/lib/pythran/pythonic/scipy/special creating build/lib/pythran/pythonic/include/builtins/complex copying pythran/pythonic/include/builtins/complex/conjugate.hpp -> build/lib/pythran/pythonic/include/builtins/complex creating build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/clear.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/copy.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/fromkeys.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/get.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying 
pythran/pythonic/include/builtins/dict/items.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/keys.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/pop.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/popitem.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/setdefault.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/update.hpp -> build/lib/pythran/pythonic/include/builtins/dict copying pythran/pythonic/include/builtins/dict/values.hpp -> build/lib/pythran/pythonic/include/builtins/dict creating build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/close.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/fileno.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/flush.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/isatty.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/next.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/read.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/readline.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/readlines.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/seek.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/tell.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/truncate.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/write.hpp -> build/lib/pythran/pythonic/include/builtins/file copying pythran/pythonic/include/builtins/file/writelines.hpp -> build/lib/pythran/pythonic/include/builtins/file creating build/lib/pythran/pythonic/include/builtins/float_ copying pythran/pythonic/include/builtins/float_/is_integer.hpp -> build/lib/pythran/pythonic/include/builtins/float_ creating build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/append.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/count.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/extend.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/insert.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/pop.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/remove.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/reverse.hpp -> build/lib/pythran/pythonic/include/builtins/list copying pythran/pythonic/include/builtins/list/sort.hpp -> build/lib/pythran/pythonic/include/builtins/list creating build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/StaticIfBreak.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying 
pythran/pythonic/include/builtins/pythran/StaticIfCont.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/StaticIfNoReturn.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/StaticIfReturn.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/abssqr.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/and_.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/is_none.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/kwonly.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/len_set.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/make_shape.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/or_.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/static_if.hpp -> build/lib/pythran/pythonic/include/builtins/pythran copying pythran/pythonic/include/builtins/pythran/static_list.hpp -> build/lib/pythran/pythonic/include/builtins/pythran creating build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/add.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/clear.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/copy.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/difference.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/difference_update.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/discard.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/intersection.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/intersection_update.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/isdisjoint.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/issubset.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/issuperset.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/remove.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/symmetric_difference.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/symmetric_difference_update.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/union_.hpp -> build/lib/pythran/pythonic/include/builtins/set copying pythran/pythonic/include/builtins/set/update.hpp -> build/lib/pythran/pythonic/include/builtins/set creating build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/__mod__.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/capitalize.hpp -> build/lib/pythran/pythonic/include/builtins/str 
copying pythran/pythonic/include/builtins/str/count.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/endswith.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/find.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/isalpha.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/isdigit.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/join.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/lower.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/lstrip.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/replace.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/rstrip.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/split.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/startswith.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/strip.hpp -> build/lib/pythran/pythonic/include/builtins/str copying pythran/pythonic/include/builtins/str/upper.hpp -> build/lib/pythran/pythonic/include/builtins/str creating build/lib/pythran/pythonic/include/numpy/add copying pythran/pythonic/include/numpy/add/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/add copying pythran/pythonic/include/numpy/add/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/add creating build/lib/pythran/pythonic/include/numpy/arctan2 copying pythran/pythonic/include/numpy/arctan2/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/arctan2 creating build/lib/pythran/pythonic/include/numpy/bitwise_and copying pythran/pythonic/include/numpy/bitwise_and/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_and copying pythran/pythonic/include/numpy/bitwise_and/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_and creating build/lib/pythran/pythonic/include/numpy/bitwise_or copying pythran/pythonic/include/numpy/bitwise_or/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_or copying pythran/pythonic/include/numpy/bitwise_or/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_or creating build/lib/pythran/pythonic/include/numpy/bitwise_xor copying pythran/pythonic/include/numpy/bitwise_xor/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_xor copying pythran/pythonic/include/numpy/bitwise_xor/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/bitwise_xor creating build/lib/pythran/pythonic/include/numpy/copysign copying pythran/pythonic/include/numpy/copysign/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/copysign creating build/lib/pythran/pythonic/include/numpy/ctypeslib copying pythran/pythonic/include/numpy/ctypeslib/as_array.hpp -> build/lib/pythran/pythonic/include/numpy/ctypeslib creating build/lib/pythran/pythonic/include/numpy/divide copying pythran/pythonic/include/numpy/divide/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/divide creating build/lib/pythran/pythonic/include/numpy/dtype copying pythran/pythonic/include/numpy/dtype/type.hpp -> build/lib/pythran/pythonic/include/numpy/dtype creating 
build/lib/pythran/pythonic/include/numpy/equal copying pythran/pythonic/include/numpy/equal/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/equal creating build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/c2c.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/fft.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/hfft.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/ifft.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/ihfft.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/irfft.hpp -> build/lib/pythran/pythonic/include/numpy/fft copying pythran/pythonic/include/numpy/fft/rfft.hpp -> build/lib/pythran/pythonic/include/numpy/fft creating build/lib/pythran/pythonic/include/numpy/floor_divide copying pythran/pythonic/include/numpy/floor_divide/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/floor_divide creating build/lib/pythran/pythonic/include/numpy/fmax copying pythran/pythonic/include/numpy/fmax/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/fmax copying pythran/pythonic/include/numpy/fmax/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/fmax creating build/lib/pythran/pythonic/include/numpy/fmin copying pythran/pythonic/include/numpy/fmin/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/fmin copying pythran/pythonic/include/numpy/fmin/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/fmin creating build/lib/pythran/pythonic/include/numpy/fmod copying pythran/pythonic/include/numpy/fmod/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/fmod creating build/lib/pythran/pythonic/include/numpy/greater copying pythran/pythonic/include/numpy/greater/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/greater creating build/lib/pythran/pythonic/include/numpy/greater_equal copying pythran/pythonic/include/numpy/greater_equal/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/greater_equal creating build/lib/pythran/pythonic/include/numpy/heaviside copying pythran/pythonic/include/numpy/heaviside/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/heaviside creating build/lib/pythran/pythonic/include/numpy/hypot copying pythran/pythonic/include/numpy/hypot/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/hypot creating build/lib/pythran/pythonic/include/numpy/ldexp copying pythran/pythonic/include/numpy/ldexp/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/ldexp creating build/lib/pythran/pythonic/include/numpy/left_shift copying pythran/pythonic/include/numpy/left_shift/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/left_shift creating build/lib/pythran/pythonic/include/numpy/less copying pythran/pythonic/include/numpy/less/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/less creating build/lib/pythran/pythonic/include/numpy/less_equal copying pythran/pythonic/include/numpy/less_equal/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/less_equal creating build/lib/pythran/pythonic/include/numpy/linalg copying pythran/pythonic/include/numpy/linalg/matrix_power.hpp -> build/lib/pythran/pythonic/include/numpy/linalg copying pythran/pythonic/include/numpy/linalg/norm.hpp -> build/lib/pythran/pythonic/include/numpy/linalg creating build/lib/pythran/pythonic/include/numpy/logaddexp copying 
pythran/pythonic/include/numpy/logaddexp/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/logaddexp creating build/lib/pythran/pythonic/include/numpy/logaddexp2 copying pythran/pythonic/include/numpy/logaddexp2/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/logaddexp2 creating build/lib/pythran/pythonic/include/numpy/logical_and copying pythran/pythonic/include/numpy/logical_and/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/logical_and creating build/lib/pythran/pythonic/include/numpy/logical_or copying pythran/pythonic/include/numpy/logical_or/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/logical_or creating build/lib/pythran/pythonic/include/numpy/logical_xor copying pythran/pythonic/include/numpy/logical_xor/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/logical_xor creating build/lib/pythran/pythonic/include/numpy/maximum copying pythran/pythonic/include/numpy/maximum/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/maximum copying pythran/pythonic/include/numpy/maximum/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/maximum creating build/lib/pythran/pythonic/include/numpy/minimum copying pythran/pythonic/include/numpy/minimum/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/minimum copying pythran/pythonic/include/numpy/minimum/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/minimum creating build/lib/pythran/pythonic/include/numpy/mod copying pythran/pythonic/include/numpy/mod/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/mod creating build/lib/pythran/pythonic/include/numpy/multiply copying pythran/pythonic/include/numpy/multiply/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/multiply copying pythran/pythonic/include/numpy/multiply/reduce.hpp -> build/lib/pythran/pythonic/include/numpy/multiply creating build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/astype.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/fill.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/flatten.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/item.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/reshape.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/sort.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/tofile.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/tolist.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray copying pythran/pythonic/include/numpy/ndarray/tostring.hpp -> build/lib/pythran/pythonic/include/numpy/ndarray creating build/lib/pythran/pythonic/include/numpy/negative copying pythran/pythonic/include/numpy/negative/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/negative creating build/lib/pythran/pythonic/include/numpy/nextafter copying pythran/pythonic/include/numpy/nextafter/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/nextafter creating build/lib/pythran/pythonic/include/numpy/not_equal copying pythran/pythonic/include/numpy/not_equal/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/not_equal creating build/lib/pythran/pythonic/include/numpy/power copying pythran/pythonic/include/numpy/power/accumulate.hpp -> 
build/lib/pythran/pythonic/include/numpy/power creating build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/binomial.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/bytes.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/chisquare.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/choice.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/dirichlet.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/exponential.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/f.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/gamma.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/generator.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/geometric.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/gumbel.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/laplace.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/logistic.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/lognormal.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/logseries.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/negative_binomial.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/normal.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/pareto.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/poisson.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/power.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/rand.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/randint.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/randn.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/random.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/random_integers.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/random_sample.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/ranf.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/rayleigh.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/sample.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/seed.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/shuffle.hpp -> build/lib/pythran/pythonic/include/numpy/random copying 
pythran/pythonic/include/numpy/random/standard_exponential.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/standard_gamma.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/standard_normal.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/uniform.hpp -> build/lib/pythran/pythonic/include/numpy/random copying pythran/pythonic/include/numpy/random/weibull.hpp -> build/lib/pythran/pythonic/include/numpy/random creating build/lib/pythran/pythonic/include/numpy/remainder copying pythran/pythonic/include/numpy/remainder/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/remainder creating build/lib/pythran/pythonic/include/numpy/right_shift copying pythran/pythonic/include/numpy/right_shift/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/right_shift creating build/lib/pythran/pythonic/include/numpy/subtract copying pythran/pythonic/include/numpy/subtract/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/subtract creating build/lib/pythran/pythonic/include/numpy/true_divide copying pythran/pythonic/include/numpy/true_divide/accumulate.hpp -> build/lib/pythran/pythonic/include/numpy/true_divide creating build/lib/pythran/pythonic/include/os creating build/lib/pythran/pythonic/include/os/path copying pythran/pythonic/include/os/path/join.hpp -> build/lib/pythran/pythonic/include/os/path creating build/lib/pythran/pythonic/include/scipy creating build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/binom.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/gamma.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/gammaln.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/hankel1.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/hankel2.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/i0.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/i0e.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/iv.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/ivp.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/jv.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/jvp.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/kv.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/kvp.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/spherical_jn.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/spherical_yn.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/yv.hpp -> build/lib/pythran/pythonic/include/scipy/special copying pythran/pythonic/include/scipy/special/yvp.hpp -> build/lib/pythran/pythonic/include/scipy/special creating build/lib/pythran/pythonic/io creating build/lib/pythran/pythonic/io/_io creating 
build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/close.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/fileno.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/flush.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/isatty.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/next.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/read.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/readline.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/readlines.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/seek.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/tell.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/truncate.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/write.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper copying pythran/pythonic/io/_io/TextIOWrapper/writelines.hpp -> build/lib/pythran/pythonic/io/_io/TextIOWrapper creating build/lib/pythran/pythonic/include/io creating build/lib/pythran/pythonic/include/io/_io creating build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/close.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/fileno.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/flush.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/isatty.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/next.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/read.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/readline.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/readlines.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/seek.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/tell.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/truncate.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/write.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper copying pythran/pythonic/include/io/_io/TextIOWrapper/writelines.hpp -> build/lib/pythran/pythonic/include/io/_io/TextIOWrapper creating build/lib/pythran/pythonic/patch copying pythran/pythonic/patch/README.rst -> build/lib/pythran/pythonic/patch copying pythran/pythonic/patch/complex -> build/lib/pythran/pythonic/patch installing to build/bdist.linux-ppc64le/wheel running install running install_lib creating build/bdist.linux-ppc64le creating 
build/bdist.linux-ppc64le/wheel creating build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/__init__.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/backend.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/config.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/conversion.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/cxxgen.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/cxxtypes.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/dist.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/errors.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/frontend.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/graph.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/interval.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/intrinsic.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/log.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/magic.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/metadata.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/middlend.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/openmp.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/passmanager.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/run.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/spec.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/syntax.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/tables.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/toolchain.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/typing.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/unparse.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/utils.py -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/version.py -> build/bdist.linux-ppc64le/wheel/pythran creating build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/__init__.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/aliases.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/ancestors.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/argument_effects.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/argument_read_once.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/ast_matcher.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/cfg.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/constant_expressions.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/dependencies.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/extended_syntax_check.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/fixed_size_list.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/global_declarations.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying 
build/lib/pythran/analyses/global_effects.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/globals_analysis.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/has_return.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/identifiers.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/immediates.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/imported_ids.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/inlinable.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/is_assigned.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/lazyness_analysis.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/literals.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/local_declarations.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/locals_analysis.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/node_count.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/optimizable_comprehension.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/ordered_global_declarations.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/parallel_maps.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/potential_iterator.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/pure_expressions.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/range_values.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/scope.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/static_expressions.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/use_def_chain.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/use_omp.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses copying build/lib/pythran/analyses/yield_points.py -> build/bdist.linux-ppc64le/wheel/pythran/analyses creating build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/__init__.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/expand_builtins.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/expand_globals.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/expand_import_all.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/expand_imports.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/extract_doc_strings.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/false_polymorphism.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/handle_import.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying 
build/lib/pythran/transformations/normalize_compare.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_exception.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_ifelse.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_is_none.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_method_calls.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_return.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_static_if.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/normalize_tuples.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/remove_comprehension.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/remove_fstrings.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/remove_lambdas.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/remove_named_arguments.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/remove_nested_functions.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations copying build/lib/pythran/transformations/unshadow_parameters.py -> build/bdist.linux-ppc64le/wheel/pythran/transformations creating build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/__init__.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/comprehension_patterns.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/constant_folding.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/dead_code_elimination.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/forward_substitution.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/inline_builtins.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/inlining.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/iter_transformation.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/list_comp_to_genexp.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/list_to_tuple.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/loop_full_unrolling.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/modindex.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/pattern_transform.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/range_based_simplify.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/range_loop_unfolding.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying 
build/lib/pythran/optimizations/remove_dead_functions.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/simplify_except.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/square.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations copying build/lib/pythran/optimizations/tuple_to_shape.py -> build/bdist.linux-ppc64le/wheel/pythran/optimizations creating build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/__init__.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/conversion.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/reorder.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/signature.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/tog.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/type_dependencies.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/types/types.py -> build/bdist.linux-ppc64le/wheel/pythran/types copying build/lib/pythran/pythran-darwin.cfg -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/pythran-default.cfg -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/pythran-linux.cfg -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/pythran-linux2.cfg -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/pythran-win32.cfg -> build/bdist.linux-ppc64le/wheel/pythran copying build/lib/pythran/pythran.cfg -> build/bdist.linux-ppc64le/wheel/pythran creating build/bdist.linux-ppc64le/wheel/pythran/pythonic copying build/lib/pythran/pythonic/core.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/index.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ copying build/lib/pythran/pythonic/__dispatch__/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/__dispatch__ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/bisect copying build/lib/pythran/pythonic/bisect/bisect.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/bisect copying build/lib/pythran/pythonic/bisect/bisect_left.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/bisect copying build/lib/pythran/pythonic/bisect/bisect_right.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/bisect creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying 
build/lib/pythran/pythonic/builtins/ArithmeticError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/AssertionError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/AttributeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/BaseException.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/BufferError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/BytesWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/DeprecationWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/EOFError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/EnvironmentError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/Exception.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/False.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/FileNotFoundError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/FloatingPointError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/FutureWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/GeneratorExit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/IOError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ImportError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ImportWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/IndentationError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/IndexError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/KeyError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/KeyboardInterrupt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/LookupError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/MemoryError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/NameError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/None.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/NotImplementedError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/OSError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/OverflowError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying 
build/lib/pythran/pythonic/builtins/PendingDeprecationWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ReferenceError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/RuntimeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/RuntimeWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/StopIteration.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/SyntaxError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/SyntaxWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/SystemError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/SystemExit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/TabError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/True.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/TypeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/UnboundLocalError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/UnicodeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/UnicodeWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/UserWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ValueError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/Warning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ZeroDivisionError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/all.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/any.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/assert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/bin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/bool_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/chr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/dict.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/divmod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/enumerate.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/file.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/filter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/float_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/getattr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/hex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/id.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/in.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/isinstance.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/iter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/len.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/map.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/max.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/min.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/minmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/oct.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/open.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/ord.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/print.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/range.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/reversed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/round.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/slice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/sorted.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/sum.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/xrange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins copying build/lib/pythran/pythonic/builtins/zip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/complex copying build/lib/pythran/pythonic/builtins/complex/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/complex creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/fromkeys.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/get.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/items.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/keys.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/popitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/setdefault.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict copying build/lib/pythran/pythonic/builtins/dict/values.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/dict creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/close.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/fileno.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/flush.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/isatty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/read.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/readline.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/readlines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/seek.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/tell.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying 
build/lib/pythran/pythonic/builtins/file/truncate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/write.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file copying build/lib/pythran/pythonic/builtins/file/writelines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/file creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/float_ copying build/lib/pythran/pythonic/builtins/float_/is_integer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/float_ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/append.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/extend.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/insert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/reverse.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list copying build/lib/pythran/pythonic/builtins/list/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/list creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/StaticIfBreak.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/StaticIfCont.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/StaticIfNoReturn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/StaticIfReturn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/abssqr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/and_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/is_none.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/kwonly.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/len_set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/make_shape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/or_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/static_if.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran copying build/lib/pythran/pythonic/builtins/pythran/static_list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/pythran creating 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/difference.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/difference_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/discard.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/intersection.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/intersection_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/isdisjoint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/issubset.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/issuperset.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/symmetric_difference.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/symmetric_difference_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/union_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set copying build/lib/pythran/pythonic/builtins/set/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/set creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/__mod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/capitalize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/endswith.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/find.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/isalpha.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/isdigit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/join.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/lower.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/lstrip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/replace.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/rstrip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/startswith.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/strip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str copying build/lib/pythran/pythonic/builtins/str/upper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/builtins/str creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/acos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/acosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/asin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/asinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/atan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/atanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath copying build/lib/pythran/pythonic/cmath/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/cmath creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/functools copying build/lib/pythran/pythonic/functools/partial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/functools copying build/lib/pythran/pythonic/functools/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/functools creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/combinations.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/common.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying 
build/lib/pythran/pythonic/itertools/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/ifilter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/islice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/permutations.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/product.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools copying build/lib/pythran/pythonic/itertools/repeat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/itertools creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/acos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/acosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/asin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/asinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/atan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/atan2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/atanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/ceil.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/copysign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/degrees.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/erf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/erfc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/expm1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/fabs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/factorial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/floor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/fmod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/frexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/hypot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying 
build/lib/pythran/pythonic/math/ldexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/lgamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/log1p.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/modf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/radians.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math copying build/lib/pythran/pythonic/math/trunc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/math creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/NINF.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/absolute.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/alen.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/all.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/allclose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/alltrue.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/amax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/amin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/angle.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/angle_in_deg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/angle_in_rad.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/any.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/append.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arccos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying 
build/lib/pythran/pythonic/numpy/arccosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arcsin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arcsinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arctan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arctan2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/arctanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/argmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/argmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/argminmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/argsort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/argwhere.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/around.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array2string.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array_equiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array_split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/array_str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/asarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/asarray_chkfinite.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ascontiguousarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/asfarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/asscalar.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/atleast_1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/atleast_2d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/atleast_3d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/average.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/base_repr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/binary_repr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bincount.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bitwise_and.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bitwise_not.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bitwise_or.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bitwise_xor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/bool_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/broadcast_to.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/byte.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cbrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ceil.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/clip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/complex128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/complex256.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/complex64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/concatenate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/conj.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/convolve.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/copysign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/copyto.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/correlate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/count_nonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cross.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cumprod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cumproduct.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/cumsum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/deg2rad.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/degrees.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/delete_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/diag.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/diagflat.hpp 
-> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/diagonal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/diff.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/digitize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/dot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/double_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ediff1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/empty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/empty_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/expand_dims.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/expm1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/eye.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fabs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fill_diagonal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/finfo.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fix.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/flatnonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/flip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fliplr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/flipud.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/float128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/float32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/float64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/float_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/floor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/floor_divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fmod.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/frexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fromfile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fromfunction.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fromiter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/fromstring.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/full.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/full_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/greater.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/greater_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/heaviside.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/hstack.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/hypot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/identity.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/imag.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/indices.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/inf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/inner.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/insert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/int16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/int32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/int64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/int8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/intc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/interp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/interp_core.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/intersect1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/intp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/invert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isclose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/iscomplex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isfinite.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isneginf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isposinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isreal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isrealobj.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/isscalar.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/issctype.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ldexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/left_shift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/less.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/less_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/lexsort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/linspace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/log1p.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/log2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logaddexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logaddexp2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logical_and.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logical_not.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logical_or.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logical_xor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/logspace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/longlong.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/max.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/maximum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/mean.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/median.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/min.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/minimum.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/mod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/multiply.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nan_to_num.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nanargmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nanargmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nanmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nanmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nansum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ndarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ndenumerate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ndim.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ndindex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/negative.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/newaxis.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nextafter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/nonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/not_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ones.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ones_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/outer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/partial_sum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/place.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/power.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/prod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/product.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ptp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/put.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/putmask.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/rad2deg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/radians.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ravel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/real.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/reciprocal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/remainder.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/repeat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/resize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/right_shift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/rint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/roll.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/rollaxis.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/rot90.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/round.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/round_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/searchsorted.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/select.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/setdiff1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/shape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/short_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/signbit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/size.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sometrue.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sort_complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/spacing.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/square.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/stack.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/std_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/subtract.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/sum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/swapaxes.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/take.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/tile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/trace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/transpose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/tri.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/tril.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/trim_zeros.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/triu.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/true_divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/trunc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ubyte.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ufunc_accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ufunc_reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uint16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uint32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uint64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uint8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uintc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/uintp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ulonglong.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/union1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/unique.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/unravel_index.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/unwrap.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/ushort.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/var.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/vdot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/vstack.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/where.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/zeros.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy copying build/lib/pythran/pythonic/numpy/zeros_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/add copying build/lib/pythran/pythonic/numpy/add/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/add copying build/lib/pythran/pythonic/numpy/add/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/add creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/arctan2 copying build/lib/pythran/pythonic/numpy/arctan2/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/arctan2 creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_and copying build/lib/pythran/pythonic/numpy/bitwise_and/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_and copying build/lib/pythran/pythonic/numpy/bitwise_and/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_and creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_or copying build/lib/pythran/pythonic/numpy/bitwise_or/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_or copying build/lib/pythran/pythonic/numpy/bitwise_or/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_or creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_xor copying build/lib/pythran/pythonic/numpy/bitwise_xor/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_xor copying build/lib/pythran/pythonic/numpy/bitwise_xor/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/bitwise_xor creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/copysign copying build/lib/pythran/pythonic/numpy/copysign/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/copysign creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ctypeslib copying build/lib/pythran/pythonic/numpy/ctypeslib/as_array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ctypeslib creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/divide copying build/lib/pythran/pythonic/numpy/divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/dtype copying build/lib/pythran/pythonic/numpy/dtype/type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/dtype creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/equal copying build/lib/pythran/pythonic/numpy/equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/c2c.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/fft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft 
copying build/lib/pythran/pythonic/numpy/fft/hfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/ifft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/ihfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/irfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/pocketfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft copying build/lib/pythran/pythonic/numpy/fft/rfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fft creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/floor_divide copying build/lib/pythran/pythonic/numpy/floor_divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/floor_divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmax copying build/lib/pythran/pythonic/numpy/fmax/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmax copying build/lib/pythran/pythonic/numpy/fmax/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmax creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmin copying build/lib/pythran/pythonic/numpy/fmin/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmin copying build/lib/pythran/pythonic/numpy/fmin/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmin creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmod copying build/lib/pythran/pythonic/numpy/fmod/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/fmod creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/greater copying build/lib/pythran/pythonic/numpy/greater/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/greater creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/greater_equal copying build/lib/pythran/pythonic/numpy/greater_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/greater_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/heaviside copying build/lib/pythran/pythonic/numpy/heaviside/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/heaviside creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/hypot copying build/lib/pythran/pythonic/numpy/hypot/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/hypot creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ldexp copying build/lib/pythran/pythonic/numpy/ldexp/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ldexp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/left_shift copying build/lib/pythran/pythonic/numpy/left_shift/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/left_shift creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/less copying build/lib/pythran/pythonic/numpy/less/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/less creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/less_equal copying build/lib/pythran/pythonic/numpy/less_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/less_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/linalg copying build/lib/pythran/pythonic/numpy/linalg/matrix_power.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/linalg copying build/lib/pythran/pythonic/numpy/linalg/norm.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/linalg creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logaddexp copying build/lib/pythran/pythonic/numpy/logaddexp/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logaddexp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logaddexp2 copying build/lib/pythran/pythonic/numpy/logaddexp2/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logaddexp2 creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_and copying build/lib/pythran/pythonic/numpy/logical_and/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_and creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_or copying build/lib/pythran/pythonic/numpy/logical_or/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_or creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_xor copying build/lib/pythran/pythonic/numpy/logical_xor/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/logical_xor creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/maximum copying build/lib/pythran/pythonic/numpy/maximum/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/maximum copying build/lib/pythran/pythonic/numpy/maximum/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/maximum creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/minimum copying build/lib/pythran/pythonic/numpy/minimum/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/minimum copying build/lib/pythran/pythonic/numpy/minimum/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/minimum creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/mod copying build/lib/pythran/pythonic/numpy/mod/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/mod creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/multiply copying build/lib/pythran/pythonic/numpy/multiply/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/multiply copying build/lib/pythran/pythonic/numpy/multiply/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/multiply creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/astype.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/fill.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/flatten.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/item.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/reshape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/tofile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying build/lib/pythran/pythonic/numpy/ndarray/tolist.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray copying 
build/lib/pythran/pythonic/numpy/ndarray/tostring.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/ndarray creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/negative copying build/lib/pythran/pythonic/numpy/negative/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/negative creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/nextafter copying build/lib/pythran/pythonic/numpy/nextafter/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/nextafter creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/not_equal copying build/lib/pythran/pythonic/numpy/not_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/not_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/power copying build/lib/pythran/pythonic/numpy/power/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/power creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/binomial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/bytes.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/chisquare.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/choice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/dirichlet.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/exponential.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/f.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/geometric.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/gumbel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/laplace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/logistic.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/lognormal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/logseries.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/negative_binomial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/normal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/pareto.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/poisson.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/power.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/rand.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying 
build/lib/pythran/pythonic/numpy/random/randint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/randn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/random.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/random_integers.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/random_sample.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/ranf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/rayleigh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/sample.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/seed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/shuffle.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/standard_exponential.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/standard_gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/standard_normal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/uniform.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random copying build/lib/pythran/pythonic/numpy/random/weibull.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/random creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/remainder copying build/lib/pythran/pythonic/numpy/remainder/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/remainder creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/right_shift copying build/lib/pythran/pythonic/numpy/right_shift/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/right_shift creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/subtract copying build/lib/pythran/pythonic/numpy/subtract/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/subtract creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/true_divide copying build/lib/pythran/pythonic/numpy/true_divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/numpy/true_divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/get_num_threads.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/get_thread_num.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/get_wtick.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/get_wtime.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/in_parallel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp copying build/lib/pythran/pythonic/omp/set_nested.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/omp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying 
build/lib/pythran/pythonic/operator_/__abs__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__add__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__and__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__concat__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__contains__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__delitem__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__div__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__eq__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__floordiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ge__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__getitem__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__gt__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__iadd__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__iand__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__iconcat__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__idiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ifloordiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ilshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__imod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__imul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__inv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__invert__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ior__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ipow__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__irshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__isub__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__itruediv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ixor__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__le__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__lshift__.hpp 
-> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__lt__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__matmul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__mod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__mul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__ne__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__neg__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__not__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__or__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__pos__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__rshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__sub__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__truediv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/__xor__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/and_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/concat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/contains.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/countOf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/delitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/div.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/eq.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/floordiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ge.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/getitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/gt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/iadd.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/iand.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/icommon.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/iconcat.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/idiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ifloordiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ilshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/imatmul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/imax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/imin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/imod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/imul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/indexOf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/inv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/invert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ior.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ipow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/irshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/is_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/is_not.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/isub.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/itemgetter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/itruediv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ixor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/le.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/lshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/lt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/matmul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/mod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/mul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/ne.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/neg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/not_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/or_.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/overloads.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/pos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/rshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/sub.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/truediv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/truth.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ copying build/lib/pythran/pythonic/operator_/xor_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/operator_ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/python copying build/lib/pythran/pythonic/python/core.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/python copying build/lib/pythran/pythonic/python/exception_handler.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/python creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/choice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/expovariate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/gauss.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/randint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/random.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/randrange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/sample.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/seed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/shuffle.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random copying build/lib/pythran/pythonic/random/uniform.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/random creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/ascii_letters.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/ascii_lowercase.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/ascii_uppercase.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/digits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/find.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/hexdigits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string copying build/lib/pythran/pythonic/string/octdigits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/string creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/time copying build/lib/pythran/pythonic/time/sleep.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/time copying 
build/lib/pythran/pythonic/time/time.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/time creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/NoneType.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/assignable.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/attr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/bool.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/cfun.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/combined.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/complex128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/complex256.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/complex64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/dict.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/dynamic_tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/empty_iterator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/exceptions.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/file.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/finfo.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/float.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/float128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/float32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/float64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/generator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/int.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/int16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/int32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/int64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/int8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/intc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/intp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/ndarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/nditerator.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_binary_op.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_broadcast.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_expr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_gexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_iexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_nary_expr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_op_helper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_operators.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_texpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_unary_op.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/numpy_vexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/pointer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/raw_array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/slice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/static_if.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/traits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uint16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uint32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uint64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uint8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uintc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/uintp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/variant_functor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types copying build/lib/pythran/pythonic/types/vectorizable_type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/types creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/array_helper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/broadcast_copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/functor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying 
build/lib/pythran/pythonic/utils/fwd.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/iterator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/meta.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/nested_container.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/neutral.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/numpy_conversion.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/numpy_traits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/pdqsort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/reserve.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/seq.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/shared_ref.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/tags.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils copying build/lib/pythran/pythonic/utils/yield.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/utils creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/index.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ copying build/lib/pythran/pythonic/include/__dispatch__/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/__dispatch__ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/bisect copying build/lib/pythran/pythonic/include/bisect/bisect.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/bisect copying build/lib/pythran/pythonic/include/bisect/bisect_left.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/bisect copying build/lib/pythran/pythonic/include/bisect/bisect_right.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/bisect creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying 
build/lib/pythran/pythonic/include/builtins/ArithmeticError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/AssertionError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/AttributeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/BaseException.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/BufferError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/BytesWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/DeprecationWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/EOFError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/EnvironmentError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/Exception.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/False.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/FileNotFoundError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/FloatingPointError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/FutureWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/GeneratorExit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/IOError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/ImportError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/ImportWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/IndentationError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/IndexError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/KeyError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/KeyboardInterrupt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/LookupError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/MemoryError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/NameError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/None.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/NotImplementedError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/OSError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/OverflowError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/PendingDeprecationWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/ReferenceError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/RuntimeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/RuntimeWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/StopIteration.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/SyntaxError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/SyntaxWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/SystemError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/SystemExit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/TabError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/True.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/TypeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/UnboundLocalError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/UnicodeError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/UnicodeWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/UserWarning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/ValueError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/Warning.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/ZeroDivisionError.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/all.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/any.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying 
build/lib/pythran/pythonic/include/builtins/assert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/bin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/bool_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/chr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/dict.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/divmod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/enumerate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/file.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/filter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/float_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/getattr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/hex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/id.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/in.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/isinstance.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/iter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/len.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/map.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/max.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/min.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/minmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/oct.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/open.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying 
build/lib/pythran/pythonic/include/builtins/ord.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/print.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/range.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/reversed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/round.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/slice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/sorted.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/sum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/xrange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins copying build/lib/pythran/pythonic/include/builtins/zip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/complex copying build/lib/pythran/pythonic/include/builtins/complex/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/complex creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/fromkeys.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/get.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/items.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/keys.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/popitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying 
build/lib/pythran/pythonic/include/builtins/dict/setdefault.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict copying build/lib/pythran/pythonic/include/builtins/dict/values.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/dict creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/close.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/fileno.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/flush.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/isatty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/read.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/readline.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/readlines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/seek.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/tell.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/truncate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/write.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file copying build/lib/pythran/pythonic/include/builtins/file/writelines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/file creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/float_ copying build/lib/pythran/pythonic/include/builtins/float_/is_integer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/float_ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/append.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/extend.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/insert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/pop.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying 
build/lib/pythran/pythonic/include/builtins/list/reverse.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list copying build/lib/pythran/pythonic/include/builtins/list/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/list creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/StaticIfBreak.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/StaticIfCont.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/StaticIfNoReturn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/StaticIfReturn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/abssqr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/and_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/is_none.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/kwonly.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/len_set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/make_shape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/or_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/static_if.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran copying build/lib/pythran/pythonic/include/builtins/pythran/static_list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/pythran creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/clear.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/difference.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/difference_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/discard.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/intersection.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/intersection_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set 
copying build/lib/pythran/pythonic/include/builtins/set/isdisjoint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/issubset.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/issuperset.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/remove.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/symmetric_difference.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/symmetric_difference_update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/union_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set copying build/lib/pythran/pythonic/include/builtins/set/update.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/set creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/__mod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/capitalize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/endswith.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/find.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/isalpha.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/isdigit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/join.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/lower.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/lstrip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/replace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/rstrip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/startswith.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/strip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str copying build/lib/pythran/pythonic/include/builtins/str/upper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/builtins/str creating 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/acos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/acosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/asin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/asinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/atan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/atanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath copying build/lib/pythran/pythonic/include/cmath/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/cmath creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/functools copying build/lib/pythran/pythonic/include/functools/partial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/functools copying build/lib/pythran/pythonic/include/functools/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/functools creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/combinations.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/common.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/count.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/ifilter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying 
build/lib/pythran/pythonic/include/itertools/islice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/permutations.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/product.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools copying build/lib/pythran/pythonic/include/itertools/repeat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/itertools creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/acos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/acosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/asin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/asinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/atan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/atan2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/atanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/ceil.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/copysign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/degrees.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/erf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/erfc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/expm1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/fabs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/factorial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/floor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/fmod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/frexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/hypot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math 
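The long run of "copying build/lib/... -> build/bdist.linux-ppc64le/wheel/..." entries above and below is the bdist_wheel stage of the Python build: build_py first places pythran's bundled C++ header tree (pythran/pythonic/) under build/lib/, and the wheel command then mirrors each of those files into the wheel staging directory build/bdist.linux-ppc64le/wheel/ before the .whl is zipped up. As a rough illustration only -- this is not pythran's actual setup.py, and the package and path names are hypothetical -- non-Python files end up in a wheel this way when they are declared as setuptools package data:

    # Minimal sketch, assuming a setuptools-style project that ships C++ headers
    # alongside its Python code (names below are placeholders, not pythran's).
    from setuptools import setup, find_packages

    setup(
        name="example-with-headers",      # hypothetical project name
        version="0.1.0",
        packages=find_packages(),
        # Files matched here are copied by build_py into build/lib/<pkg>/...,
        # and bdist_wheel then mirrors build/lib/ into build/bdist.<platform>/wheel/,
        # producing per-file "copying ... -> ..." log lines like the ones in this log.
        package_data={"example_pkg": ["include/*.hpp", "include/*/*.hpp"]},
    )

With such a declaration, building a wheel (for example via pip wheel . or python setup.py bdist_wheel) emits one "copying" line for every header that is staged into the wheel, which is why this section of the log is dominated by the pythonic/ .hpp files.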
copying build/lib/pythran/pythonic/include/math/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/ldexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/lgamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/log1p.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/modf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/radians.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math copying build/lib/pythran/pythonic/include/math/trunc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/math creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/NINF.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/absolute.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/alen.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/all.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/allclose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/alltrue.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/amax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/amin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/angle.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/angle_in_deg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/angle_in_rad.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/any.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/append.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arccos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arccosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arcsin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arcsinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arctan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arctan2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/arctanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/argmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/argmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/argsort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/argwhere.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/around.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array2string.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array_equiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array_split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/array_str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/asarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/asarray_chkfinite.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ascontiguousarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/asfarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/asscalar.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/atleast_1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/atleast_2d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/atleast_3d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/average.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/base_repr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/binary_repr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bincount.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bitwise_and.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bitwise_not.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bitwise_or.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bitwise_xor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/bool_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/broadcast_to.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/byte.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cbrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ceil.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/clip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/complex128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/complex256.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/complex64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/concatenate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/conj.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/conjugate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/convolve.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/copysign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/copyto.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/correlate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cosh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/count_nonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cross.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cumprod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cumproduct.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/cumsum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/deg2rad.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/degrees.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/delete_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/diag.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/diagflat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/diagonal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/diff.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/digitize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/dot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/double_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ediff1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/empty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/empty_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/exp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/expand_dims.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/expm1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/eye.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fabs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fill_diagonal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/finfo.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fix.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/flatnonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/flip.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fliplr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/flipud.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/float128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/float32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/float64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/float_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/floor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/floor_divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fmod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/frexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fromfile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fromfunction.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fromiter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/fromstring.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/full.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/full_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/greater.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/greater_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/heaviside.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy 
copying build/lib/pythran/pythonic/include/numpy/hstack.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/hypot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/identity.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/imag.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/indices.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/inf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/inner.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/insert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/int16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/int32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/int64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/int8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/intc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/interp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/intersect1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/intp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/invert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isclose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/iscomplex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isfinite.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isnan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isneginf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isposinf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isreal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isrealobj.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/isscalar.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/issctype.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ldexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/add copying build/lib/pythran/pythonic/include/numpy/add/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/add copying build/lib/pythran/pythonic/include/numpy/add/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/add copying build/lib/pythran/pythonic/include/numpy/left_shift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/less.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/less_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/lexsort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/linspace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/log.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/log10.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/log1p.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/log2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logaddexp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logaddexp2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logical_and.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logical_not.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logical_or.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logical_xor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/logspace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/longlong.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/max.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/maximum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/mean.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/median.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/min.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/minimum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/mod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/multiply.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nan_to_num.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nanargmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nanargmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nanmax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nanmin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nansum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ndarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ndenumerate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ndim.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ndindex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/negative.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/newaxis.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nextafter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/nonzero.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/not_equal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ones.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ones_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/outer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/partial_sum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/pi.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/place.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/power.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/prod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/product.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ptp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/put.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/putmask.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/rad2deg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/radians.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ravel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/real.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/reciprocal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/remainder.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/repeat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/resize.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/right_shift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/rint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/roll.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/rollaxis.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/rot90.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/round.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/round_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/searchsorted.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/select.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/setdiff1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/shape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/short_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sign.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/signbit.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sinh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/size.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/sometrue.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sort_complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/spacing.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/split.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sqrt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/square.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/stack.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/std_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/subtract.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/sum.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/swapaxes.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/take.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/tan.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/tanh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/tile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/trace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/transpose.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/tri.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/tril.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/trim_zeros.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/triu.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/true_divide.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/trunc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ubyte.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ufunc_accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ufunc_reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying 
build/lib/pythran/pythonic/include/numpy/uint16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uint32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uint64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uint8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uintc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/uintp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ulonglong.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/union1d.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/unique.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/unravel_index.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/unwrap.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/ushort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/var.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/vdot.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/vstack.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/where.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/zeros.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy copying build/lib/pythran/pythonic/include/numpy/zeros_like.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/arctan2 copying build/lib/pythran/pythonic/include/numpy/arctan2/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/arctan2 creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_and copying build/lib/pythran/pythonic/include/numpy/bitwise_and/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_and copying build/lib/pythran/pythonic/include/numpy/bitwise_and/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_and creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_or copying build/lib/pythran/pythonic/include/numpy/bitwise_or/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_or copying build/lib/pythran/pythonic/include/numpy/bitwise_or/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_or creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_xor copying build/lib/pythran/pythonic/include/numpy/bitwise_xor/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_xor copying 
build/lib/pythran/pythonic/include/numpy/bitwise_xor/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/bitwise_xor creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/copysign copying build/lib/pythran/pythonic/include/numpy/copysign/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/copysign creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ctypeslib copying build/lib/pythran/pythonic/include/numpy/ctypeslib/as_array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ctypeslib creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/divide copying build/lib/pythran/pythonic/include/numpy/divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/dtype copying build/lib/pythran/pythonic/include/numpy/dtype/type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/dtype creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/equal copying build/lib/pythran/pythonic/include/numpy/equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/c2c.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/fft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/hfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/ifft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/ihfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/irfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft copying build/lib/pythran/pythonic/include/numpy/fft/rfft.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fft creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/floor_divide copying build/lib/pythran/pythonic/include/numpy/floor_divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/floor_divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmax copying build/lib/pythran/pythonic/include/numpy/fmax/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmax copying build/lib/pythran/pythonic/include/numpy/fmax/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmax creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmin copying build/lib/pythran/pythonic/include/numpy/fmin/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmin copying build/lib/pythran/pythonic/include/numpy/fmin/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmin creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmod copying build/lib/pythran/pythonic/include/numpy/fmod/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/fmod creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/greater copying 
build/lib/pythran/pythonic/include/numpy/greater/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/greater creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/greater_equal copying build/lib/pythran/pythonic/include/numpy/greater_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/greater_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/heaviside copying build/lib/pythran/pythonic/include/numpy/heaviside/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/heaviside creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/hypot copying build/lib/pythran/pythonic/include/numpy/hypot/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/hypot creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ldexp copying build/lib/pythran/pythonic/include/numpy/ldexp/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ldexp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/left_shift copying build/lib/pythran/pythonic/include/numpy/left_shift/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/left_shift creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/less copying build/lib/pythran/pythonic/include/numpy/less/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/less creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/less_equal copying build/lib/pythran/pythonic/include/numpy/less_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/less_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/linalg copying build/lib/pythran/pythonic/include/numpy/linalg/matrix_power.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/linalg copying build/lib/pythran/pythonic/include/numpy/linalg/norm.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/linalg creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logaddexp copying build/lib/pythran/pythonic/include/numpy/logaddexp/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logaddexp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logaddexp2 copying build/lib/pythran/pythonic/include/numpy/logaddexp2/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logaddexp2 creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_and copying build/lib/pythran/pythonic/include/numpy/logical_and/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_and creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_or copying build/lib/pythran/pythonic/include/numpy/logical_or/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_or creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_xor copying build/lib/pythran/pythonic/include/numpy/logical_xor/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/logical_xor creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/maximum copying build/lib/pythran/pythonic/include/numpy/maximum/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/maximum copying 
build/lib/pythran/pythonic/include/numpy/maximum/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/maximum creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/minimum copying build/lib/pythran/pythonic/include/numpy/minimum/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/minimum copying build/lib/pythran/pythonic/include/numpy/minimum/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/minimum creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/mod copying build/lib/pythran/pythonic/include/numpy/mod/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/mod creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/multiply copying build/lib/pythran/pythonic/include/numpy/multiply/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/multiply copying build/lib/pythran/pythonic/include/numpy/multiply/reduce.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/multiply creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/astype.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/fill.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/flatten.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/item.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/reshape.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/sort.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/tofile.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/tolist.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray copying build/lib/pythran/pythonic/include/numpy/ndarray/tostring.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/ndarray creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/negative copying build/lib/pythran/pythonic/include/numpy/negative/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/negative creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/nextafter copying build/lib/pythran/pythonic/include/numpy/nextafter/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/nextafter creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/not_equal copying build/lib/pythran/pythonic/include/numpy/not_equal/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/not_equal creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/power copying build/lib/pythran/pythonic/include/numpy/power/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/power creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/binomial.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/bytes.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/chisquare.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/choice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/dirichlet.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/exponential.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/f.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/generator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/geometric.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/gumbel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/laplace.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/logistic.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/lognormal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/logseries.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/negative_binomial.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/normal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/pareto.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/poisson.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/power.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/rand.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/randint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/randn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/random.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/random_integers.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/random_sample.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/ranf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/rayleigh.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/sample.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/seed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/shuffle.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/standard_exponential.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/standard_gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/standard_normal.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/uniform.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random copying build/lib/pythran/pythonic/include/numpy/random/weibull.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/random creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/remainder copying build/lib/pythran/pythonic/include/numpy/remainder/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/remainder creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/right_shift copying build/lib/pythran/pythonic/include/numpy/right_shift/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/right_shift creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/subtract copying build/lib/pythran/pythonic/include/numpy/subtract/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/subtract creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/true_divide copying build/lib/pythran/pythonic/include/numpy/true_divide/accumulate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/numpy/true_divide creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/get_num_threads.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/get_thread_num.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/get_wtick.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/get_wtime.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/in_parallel.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp copying build/lib/pythran/pythonic/include/omp/set_nested.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/omp creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__abs__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__add__.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__and__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__concat__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__contains__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__delitem__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__div__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__eq__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__floordiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ge__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__getitem__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__gt__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__iadd__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__iand__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__iconcat__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__idiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ifloordiv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ilshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__imod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__imul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__inv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__invert__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ior__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ipow__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__irshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__isub__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__itruediv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying 
build/lib/pythran/pythonic/include/operator_/__ixor__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__le__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__lshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__lt__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__matmul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__mod__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__mul__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__ne__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__neg__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__not__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__or__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__pos__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__rshift__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__sub__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__truediv__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/__xor__.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/abs.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/add.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/and_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/concat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/contains.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/countOf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/delitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/div.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/eq.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/floordiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ge.hpp 
-> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/getitem.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/gt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/iadd.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/iand.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/icommon.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/iconcat.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/idiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ifloordiv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ilshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/imatmul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/imax.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/imin.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/imod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/imul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/indexOf.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/inv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/invert.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ior.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ipow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/irshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/is_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/is_not.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/isub.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/itemgetter.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/itruediv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ixor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying 
build/lib/pythran/pythonic/include/operator_/le.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/lshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/lt.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/matmul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/mod.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/mul.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/ne.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/neg.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/not_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/or_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/overloads.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/pos.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/pow.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/rshift.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/sub.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/truediv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/truth.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ copying build/lib/pythran/pythonic/include/operator_/xor_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/operator_ creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/choice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/expovariate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/gauss.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/randint.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/random.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/randrange.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/sample.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/seed.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/shuffle.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random copying build/lib/pythran/pythonic/include/random/uniform.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/random creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/ascii_letters.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/ascii_lowercase.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/ascii_uppercase.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/digits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/find.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/hexdigits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string copying build/lib/pythran/pythonic/include/string/octdigits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/string creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/time copying build/lib/pythran/pythonic/include/time/sleep.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/time copying build/lib/pythran/pythonic/include/time/time.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/time creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/NoneType.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/assignable.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/attr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/bool.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/cfun.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/combined.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/complex.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/complex128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/complex256.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/complex64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/dict.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/dynamic_tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/empty_iterator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/exceptions.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/file.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying 
build/lib/pythran/pythonic/include/types/finfo.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/float.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/float128.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/float32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/float64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/generator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/immediate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/int.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/int16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/int32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/int64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/int8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/intc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/intp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/lazy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/list.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/ndarray.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/nditerator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_binary_op.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_broadcast.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_expr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_gexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_iexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_nary_expr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_op_helper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_operators.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_texpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_unary_op.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/numpy_vexpr.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/pointer.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/raw_array.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/set.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/slice.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/static_if.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/str.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/traits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/tuple.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uint16.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uint32.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uint64.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uint8.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uintc.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/uintp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/variant_functor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types copying build/lib/pythran/pythonic/include/types/vectorizable_type.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/types creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/array_helper.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/broadcast_copy.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/functor.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/fwd.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/int_.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/iterator.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/meta.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/nested_container.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/neutral.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/numpy_conversion.hpp -> 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/numpy_traits.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/reserve.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/seq.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/shared_ref.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/tags.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils copying build/lib/pythran/pythonic/include/utils/yield.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/utils creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/os creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/os/path copying build/lib/pythran/pythonic/include/os/path/join.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/os/path creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/binom.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/gammaln.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/hankel1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/hankel2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/i0.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/i0e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/iv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/ivp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/jv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/jvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/kv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/kvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/spherical_jn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/spherical_yn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying build/lib/pythran/pythonic/include/scipy/special/yv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special copying 
build/lib/pythran/pythonic/include/scipy/special/yvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/scipy/special creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/close.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/fileno.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/flush.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/isatty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/read.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/readline.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/readlines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/seek.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/tell.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/truncate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/write.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/include/io/_io/TextIOWrapper/writelines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/include/io/_io/TextIOWrapper creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/os creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/os/path copying build/lib/pythran/pythonic/os/path/join.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/os/path creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/binom.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/chbevl.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/gamma.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/gammaln.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/hankel1.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/hankel2.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying 
build/lib/pythran/pythonic/scipy/special/i0.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/i0e.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/iv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/ivp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/jv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/jvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/kv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/kvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/spherical_jn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/spherical_yn.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/yv.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special copying build/lib/pythran/pythonic/scipy/special/yvp.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/scipy/special creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/io creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io creating build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/close.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/fileno.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/flush.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/isatty.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/next.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/read.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/readline.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/readlines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/seek.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/tell.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/truncate.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/write.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper copying build/lib/pythran/pythonic/io/_io/TextIOWrapper/writelines.hpp -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/io/_io/TextIOWrapper creating 
build/bdist.linux-ppc64le/wheel/pythran/pythonic/patch copying build/lib/pythran/pythonic/patch/README.rst -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/patch copying build/lib/pythran/pythonic/patch/complex -> build/bdist.linux-ppc64le/wheel/pythran/pythonic/patch creating build/bdist.linux-ppc64le/wheel/omp copying build/lib/omp/__init__.py -> build/bdist.linux-ppc64le/wheel/omp running install_egg_info running egg_info creating pythran.egg-info writing pythran.egg-info/PKG-INFO writing dependency_links to pythran.egg-info/dependency_links.txt writing entry points to pythran.egg-info/entry_points.txt writing requirements to pythran.egg-info/requires.txt writing top-level names to pythran.egg-info/top_level.txt writing manifest file 'pythran.egg-info/SOURCES.txt' reading manifest file 'pythran.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching '*' under directory 'third_party' adding license file 'LICENSE' adding license file 'AUTHORS' writing manifest file 'pythran.egg-info/SOURCES.txt' Copying pythran.egg-info to build/bdist.linux-ppc64le/wheel/pythran-0.11.0-py3.10.egg-info running install_scripts adding license file "LICENSE" (matched pattern "LICEN[CS]E*") adding license file "AUTHORS" (matched pattern "AUTHORS*") creating build/bdist.linux-ppc64le/wheel/pythran-0.11.0.dist-info/WHEEL creating '/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir/pip-wheel-_hmbkwiy/tmpo2hktkl2/pythran-0.11.0-py3-none-any.whl' and adding 'build/bdist.linux-ppc64le/wheel' to it adding 'omp/__init__.py' adding 'pythran/__init__.py' adding 'pythran/backend.py' adding 'pythran/config.py' adding 'pythran/conversion.py' adding 'pythran/cxxgen.py' adding 'pythran/cxxtypes.py' adding 'pythran/dist.py' adding 'pythran/errors.py' adding 'pythran/frontend.py' adding 'pythran/graph.py' adding 'pythran/interval.py' adding 'pythran/intrinsic.py' adding 'pythran/log.py' adding 'pythran/magic.py' adding 'pythran/metadata.py' adding 'pythran/middlend.py' adding 'pythran/openmp.py' adding 'pythran/passmanager.py' adding 'pythran/pythran-darwin.cfg' adding 'pythran/pythran-default.cfg' adding 'pythran/pythran-linux.cfg' adding 'pythran/pythran-linux2.cfg' adding 'pythran/pythran-win32.cfg' adding 'pythran/pythran.cfg' adding 'pythran/run.py' adding 'pythran/spec.py' adding 'pythran/syntax.py' adding 'pythran/tables.py' adding 'pythran/toolchain.py' adding 'pythran/typing.py' adding 'pythran/unparse.py' adding 'pythran/utils.py' adding 'pythran/version.py' adding 'pythran/analyses/__init__.py' adding 'pythran/analyses/aliases.py' adding 'pythran/analyses/ancestors.py' adding 'pythran/analyses/argument_effects.py' adding 'pythran/analyses/argument_read_once.py' adding 'pythran/analyses/ast_matcher.py' adding 'pythran/analyses/cfg.py' adding 'pythran/analyses/constant_expressions.py' adding 'pythran/analyses/dependencies.py' adding 'pythran/analyses/extended_syntax_check.py' adding 'pythran/analyses/fixed_size_list.py' adding 'pythran/analyses/global_declarations.py' adding 'pythran/analyses/global_effects.py' adding 'pythran/analyses/globals_analysis.py' adding 'pythran/analyses/has_return.py' adding 'pythran/analyses/identifiers.py' adding 'pythran/analyses/immediates.py' adding 'pythran/analyses/imported_ids.py' adding 'pythran/analyses/inlinable.py' adding 'pythran/analyses/is_assigned.py' adding 'pythran/analyses/lazyness_analysis.py' adding 'pythran/analyses/literals.py' adding 'pythran/analyses/local_declarations.py' adding 
'pythran/analyses/locals_analysis.py' adding 'pythran/analyses/node_count.py' adding 'pythran/analyses/optimizable_comprehension.py' adding 'pythran/analyses/ordered_global_declarations.py' adding 'pythran/analyses/parallel_maps.py' adding 'pythran/analyses/potential_iterator.py' adding 'pythran/analyses/pure_expressions.py' adding 'pythran/analyses/range_values.py' adding 'pythran/analyses/scope.py' adding 'pythran/analyses/static_expressions.py' adding 'pythran/analyses/use_def_chain.py' adding 'pythran/analyses/use_omp.py' adding 'pythran/analyses/yield_points.py' adding 'pythran/optimizations/__init__.py' adding 'pythran/optimizations/comprehension_patterns.py' adding 'pythran/optimizations/constant_folding.py' adding 'pythran/optimizations/dead_code_elimination.py' adding 'pythran/optimizations/forward_substitution.py' adding 'pythran/optimizations/inline_builtins.py' adding 'pythran/optimizations/inlining.py' adding 'pythran/optimizations/iter_transformation.py' adding 'pythran/optimizations/list_comp_to_genexp.py' adding 'pythran/optimizations/list_to_tuple.py' adding 'pythran/optimizations/loop_full_unrolling.py' adding 'pythran/optimizations/modindex.py' adding 'pythran/optimizations/pattern_transform.py' adding 'pythran/optimizations/range_based_simplify.py' adding 'pythran/optimizations/range_loop_unfolding.py' adding 'pythran/optimizations/remove_dead_functions.py' adding 'pythran/optimizations/simplify_except.py' adding 'pythran/optimizations/square.py' adding 'pythran/optimizations/tuple_to_shape.py' adding 'pythran/pythonic/core.hpp' adding 'pythran/pythonic/__dispatch__/clear.hpp' adding 'pythran/pythonic/__dispatch__/conjugate.hpp' adding 'pythran/pythonic/__dispatch__/copy.hpp' adding 'pythran/pythonic/__dispatch__/count.hpp' adding 'pythran/pythonic/__dispatch__/index.hpp' adding 'pythran/pythonic/__dispatch__/pop.hpp' adding 'pythran/pythonic/__dispatch__/remove.hpp' adding 'pythran/pythonic/__dispatch__/sort.hpp' adding 'pythran/pythonic/__dispatch__/update.hpp' adding 'pythran/pythonic/bisect/bisect.hpp' adding 'pythran/pythonic/bisect/bisect_left.hpp' adding 'pythran/pythonic/bisect/bisect_right.hpp' adding 'pythran/pythonic/builtins/ArithmeticError.hpp' adding 'pythran/pythonic/builtins/AssertionError.hpp' adding 'pythran/pythonic/builtins/AttributeError.hpp' adding 'pythran/pythonic/builtins/BaseException.hpp' adding 'pythran/pythonic/builtins/BufferError.hpp' adding 'pythran/pythonic/builtins/BytesWarning.hpp' adding 'pythran/pythonic/builtins/DeprecationWarning.hpp' adding 'pythran/pythonic/builtins/EOFError.hpp' adding 'pythran/pythonic/builtins/EnvironmentError.hpp' adding 'pythran/pythonic/builtins/Exception.hpp' adding 'pythran/pythonic/builtins/False.hpp' adding 'pythran/pythonic/builtins/FileNotFoundError.hpp' adding 'pythran/pythonic/builtins/FloatingPointError.hpp' adding 'pythran/pythonic/builtins/FutureWarning.hpp' adding 'pythran/pythonic/builtins/GeneratorExit.hpp' adding 'pythran/pythonic/builtins/IOError.hpp' adding 'pythran/pythonic/builtins/ImportError.hpp' adding 'pythran/pythonic/builtins/ImportWarning.hpp' adding 'pythran/pythonic/builtins/IndentationError.hpp' adding 'pythran/pythonic/builtins/IndexError.hpp' adding 'pythran/pythonic/builtins/KeyError.hpp' adding 'pythran/pythonic/builtins/KeyboardInterrupt.hpp' adding 'pythran/pythonic/builtins/LookupError.hpp' adding 'pythran/pythonic/builtins/MemoryError.hpp' adding 'pythran/pythonic/builtins/NameError.hpp' adding 'pythran/pythonic/builtins/None.hpp' adding 
'pythran/pythonic/builtins/NotImplementedError.hpp' adding 'pythran/pythonic/builtins/OSError.hpp' adding 'pythran/pythonic/builtins/OverflowError.hpp' adding 'pythran/pythonic/builtins/PendingDeprecationWarning.hpp' adding 'pythran/pythonic/builtins/ReferenceError.hpp' adding 'pythran/pythonic/builtins/RuntimeError.hpp' adding 'pythran/pythonic/builtins/RuntimeWarning.hpp' adding 'pythran/pythonic/builtins/StopIteration.hpp' adding 'pythran/pythonic/builtins/SyntaxError.hpp' adding 'pythran/pythonic/builtins/SyntaxWarning.hpp' adding 'pythran/pythonic/builtins/SystemError.hpp' adding 'pythran/pythonic/builtins/SystemExit.hpp' adding 'pythran/pythonic/builtins/TabError.hpp' adding 'pythran/pythonic/builtins/True.hpp' adding 'pythran/pythonic/builtins/TypeError.hpp' adding 'pythran/pythonic/builtins/UnboundLocalError.hpp' adding 'pythran/pythonic/builtins/UnicodeError.hpp' adding 'pythran/pythonic/builtins/UnicodeWarning.hpp' adding 'pythran/pythonic/builtins/UserWarning.hpp' adding 'pythran/pythonic/builtins/ValueError.hpp' adding 'pythran/pythonic/builtins/Warning.hpp' adding 'pythran/pythonic/builtins/ZeroDivisionError.hpp' adding 'pythran/pythonic/builtins/abs.hpp' adding 'pythran/pythonic/builtins/all.hpp' adding 'pythran/pythonic/builtins/any.hpp' adding 'pythran/pythonic/builtins/assert.hpp' adding 'pythran/pythonic/builtins/bin.hpp' adding 'pythran/pythonic/builtins/bool_.hpp' adding 'pythran/pythonic/builtins/chr.hpp' adding 'pythran/pythonic/builtins/complex.hpp' adding 'pythran/pythonic/builtins/dict.hpp' adding 'pythran/pythonic/builtins/divmod.hpp' adding 'pythran/pythonic/builtins/enumerate.hpp' adding 'pythran/pythonic/builtins/file.hpp' adding 'pythran/pythonic/builtins/filter.hpp' adding 'pythran/pythonic/builtins/float_.hpp' adding 'pythran/pythonic/builtins/getattr.hpp' adding 'pythran/pythonic/builtins/hex.hpp' adding 'pythran/pythonic/builtins/id.hpp' adding 'pythran/pythonic/builtins/in.hpp' adding 'pythran/pythonic/builtins/int_.hpp' adding 'pythran/pythonic/builtins/isinstance.hpp' adding 'pythran/pythonic/builtins/iter.hpp' adding 'pythran/pythonic/builtins/len.hpp' adding 'pythran/pythonic/builtins/list.hpp' adding 'pythran/pythonic/builtins/map.hpp' adding 'pythran/pythonic/builtins/max.hpp' adding 'pythran/pythonic/builtins/min.hpp' adding 'pythran/pythonic/builtins/minmax.hpp' adding 'pythran/pythonic/builtins/next.hpp' adding 'pythran/pythonic/builtins/oct.hpp' adding 'pythran/pythonic/builtins/open.hpp' adding 'pythran/pythonic/builtins/ord.hpp' adding 'pythran/pythonic/builtins/pow.hpp' adding 'pythran/pythonic/builtins/print.hpp' adding 'pythran/pythonic/builtins/range.hpp' adding 'pythran/pythonic/builtins/reduce.hpp' adding 'pythran/pythonic/builtins/reversed.hpp' adding 'pythran/pythonic/builtins/round.hpp' adding 'pythran/pythonic/builtins/set.hpp' adding 'pythran/pythonic/builtins/slice.hpp' adding 'pythran/pythonic/builtins/sorted.hpp' adding 'pythran/pythonic/builtins/str.hpp' adding 'pythran/pythonic/builtins/sum.hpp' adding 'pythran/pythonic/builtins/tuple.hpp' adding 'pythran/pythonic/builtins/type.hpp' adding 'pythran/pythonic/builtins/xrange.hpp' adding 'pythran/pythonic/builtins/zip.hpp' adding 'pythran/pythonic/builtins/complex/conjugate.hpp' adding 'pythran/pythonic/builtins/dict/clear.hpp' adding 'pythran/pythonic/builtins/dict/copy.hpp' adding 'pythran/pythonic/builtins/dict/fromkeys.hpp' adding 'pythran/pythonic/builtins/dict/get.hpp' adding 'pythran/pythonic/builtins/dict/items.hpp' adding 'pythran/pythonic/builtins/dict/keys.hpp' adding 
'pythran/pythonic/builtins/dict/pop.hpp' adding 'pythran/pythonic/builtins/dict/popitem.hpp' adding 'pythran/pythonic/builtins/dict/setdefault.hpp' adding 'pythran/pythonic/builtins/dict/update.hpp' adding 'pythran/pythonic/builtins/dict/values.hpp' adding 'pythran/pythonic/builtins/file/close.hpp' adding 'pythran/pythonic/builtins/file/fileno.hpp' adding 'pythran/pythonic/builtins/file/flush.hpp' adding 'pythran/pythonic/builtins/file/isatty.hpp' adding 'pythran/pythonic/builtins/file/next.hpp' adding 'pythran/pythonic/builtins/file/read.hpp' adding 'pythran/pythonic/builtins/file/readline.hpp' adding 'pythran/pythonic/builtins/file/readlines.hpp' adding 'pythran/pythonic/builtins/file/seek.hpp' adding 'pythran/pythonic/builtins/file/tell.hpp' adding 'pythran/pythonic/builtins/file/truncate.hpp' adding 'pythran/pythonic/builtins/file/write.hpp' adding 'pythran/pythonic/builtins/file/writelines.hpp' adding 'pythran/pythonic/builtins/float_/is_integer.hpp' adding 'pythran/pythonic/builtins/list/append.hpp' adding 'pythran/pythonic/builtins/list/count.hpp' adding 'pythran/pythonic/builtins/list/extend.hpp' adding 'pythran/pythonic/builtins/list/insert.hpp' adding 'pythran/pythonic/builtins/list/pop.hpp' adding 'pythran/pythonic/builtins/list/remove.hpp' adding 'pythran/pythonic/builtins/list/reverse.hpp' adding 'pythran/pythonic/builtins/list/sort.hpp' adding 'pythran/pythonic/builtins/pythran/StaticIfBreak.hpp' adding 'pythran/pythonic/builtins/pythran/StaticIfCont.hpp' adding 'pythran/pythonic/builtins/pythran/StaticIfNoReturn.hpp' adding 'pythran/pythonic/builtins/pythran/StaticIfReturn.hpp' adding 'pythran/pythonic/builtins/pythran/abssqr.hpp' adding 'pythran/pythonic/builtins/pythran/and_.hpp' adding 'pythran/pythonic/builtins/pythran/is_none.hpp' adding 'pythran/pythonic/builtins/pythran/kwonly.hpp' adding 'pythran/pythonic/builtins/pythran/len_set.hpp' adding 'pythran/pythonic/builtins/pythran/make_shape.hpp' adding 'pythran/pythonic/builtins/pythran/or_.hpp' adding 'pythran/pythonic/builtins/pythran/static_if.hpp' adding 'pythran/pythonic/builtins/pythran/static_list.hpp' adding 'pythran/pythonic/builtins/set/add.hpp' adding 'pythran/pythonic/builtins/set/clear.hpp' adding 'pythran/pythonic/builtins/set/copy.hpp' adding 'pythran/pythonic/builtins/set/difference.hpp' adding 'pythran/pythonic/builtins/set/difference_update.hpp' adding 'pythran/pythonic/builtins/set/discard.hpp' adding 'pythran/pythonic/builtins/set/intersection.hpp' adding 'pythran/pythonic/builtins/set/intersection_update.hpp' adding 'pythran/pythonic/builtins/set/isdisjoint.hpp' adding 'pythran/pythonic/builtins/set/issubset.hpp' adding 'pythran/pythonic/builtins/set/issuperset.hpp' adding 'pythran/pythonic/builtins/set/remove.hpp' adding 'pythran/pythonic/builtins/set/symmetric_difference.hpp' adding 'pythran/pythonic/builtins/set/symmetric_difference_update.hpp' adding 'pythran/pythonic/builtins/set/union_.hpp' adding 'pythran/pythonic/builtins/set/update.hpp' adding 'pythran/pythonic/builtins/str/__mod__.hpp' adding 'pythran/pythonic/builtins/str/capitalize.hpp' adding 'pythran/pythonic/builtins/str/count.hpp' adding 'pythran/pythonic/builtins/str/endswith.hpp' adding 'pythran/pythonic/builtins/str/find.hpp' adding 'pythran/pythonic/builtins/str/isalpha.hpp' adding 'pythran/pythonic/builtins/str/isdigit.hpp' adding 'pythran/pythonic/builtins/str/join.hpp' adding 'pythran/pythonic/builtins/str/lower.hpp' adding 'pythran/pythonic/builtins/str/lstrip.hpp' adding 'pythran/pythonic/builtins/str/replace.hpp' adding 
'pythran/pythonic/builtins/str/rstrip.hpp' adding 'pythran/pythonic/builtins/str/split.hpp' adding 'pythran/pythonic/builtins/str/startswith.hpp' adding 'pythran/pythonic/builtins/str/strip.hpp' adding 'pythran/pythonic/builtins/str/upper.hpp' adding 'pythran/pythonic/cmath/acos.hpp' adding 'pythran/pythonic/cmath/acosh.hpp' adding 'pythran/pythonic/cmath/asin.hpp' adding 'pythran/pythonic/cmath/asinh.hpp' adding 'pythran/pythonic/cmath/atan.hpp' adding 'pythran/pythonic/cmath/atanh.hpp' adding 'pythran/pythonic/cmath/cos.hpp' adding 'pythran/pythonic/cmath/cosh.hpp' adding 'pythran/pythonic/cmath/e.hpp' adding 'pythran/pythonic/cmath/exp.hpp' adding 'pythran/pythonic/cmath/isinf.hpp' adding 'pythran/pythonic/cmath/isnan.hpp' adding 'pythran/pythonic/cmath/log.hpp' adding 'pythran/pythonic/cmath/log10.hpp' adding 'pythran/pythonic/cmath/pi.hpp' adding 'pythran/pythonic/cmath/sin.hpp' adding 'pythran/pythonic/cmath/sinh.hpp' adding 'pythran/pythonic/cmath/sqrt.hpp' adding 'pythran/pythonic/cmath/tan.hpp' adding 'pythran/pythonic/cmath/tanh.hpp' adding 'pythran/pythonic/functools/partial.hpp' adding 'pythran/pythonic/functools/reduce.hpp' adding 'pythran/pythonic/include/__dispatch__/clear.hpp' adding 'pythran/pythonic/include/__dispatch__/conjugate.hpp' adding 'pythran/pythonic/include/__dispatch__/copy.hpp' adding 'pythran/pythonic/include/__dispatch__/count.hpp' adding 'pythran/pythonic/include/__dispatch__/index.hpp' adding 'pythran/pythonic/include/__dispatch__/pop.hpp' adding 'pythran/pythonic/include/__dispatch__/remove.hpp' adding 'pythran/pythonic/include/__dispatch__/sort.hpp' adding 'pythran/pythonic/include/__dispatch__/update.hpp' adding 'pythran/pythonic/include/bisect/bisect.hpp' adding 'pythran/pythonic/include/bisect/bisect_left.hpp' adding 'pythran/pythonic/include/bisect/bisect_right.hpp' adding 'pythran/pythonic/include/builtins/ArithmeticError.hpp' adding 'pythran/pythonic/include/builtins/AssertionError.hpp' adding 'pythran/pythonic/include/builtins/AttributeError.hpp' adding 'pythran/pythonic/include/builtins/BaseException.hpp' adding 'pythran/pythonic/include/builtins/BufferError.hpp' adding 'pythran/pythonic/include/builtins/BytesWarning.hpp' adding 'pythran/pythonic/include/builtins/DeprecationWarning.hpp' adding 'pythran/pythonic/include/builtins/EOFError.hpp' adding 'pythran/pythonic/include/builtins/EnvironmentError.hpp' adding 'pythran/pythonic/include/builtins/Exception.hpp' adding 'pythran/pythonic/include/builtins/False.hpp' adding 'pythran/pythonic/include/builtins/FileNotFoundError.hpp' adding 'pythran/pythonic/include/builtins/FloatingPointError.hpp' adding 'pythran/pythonic/include/builtins/FutureWarning.hpp' adding 'pythran/pythonic/include/builtins/GeneratorExit.hpp' adding 'pythran/pythonic/include/builtins/IOError.hpp' adding 'pythran/pythonic/include/builtins/ImportError.hpp' adding 'pythran/pythonic/include/builtins/ImportWarning.hpp' adding 'pythran/pythonic/include/builtins/IndentationError.hpp' adding 'pythran/pythonic/include/builtins/IndexError.hpp' adding 'pythran/pythonic/include/builtins/KeyError.hpp' adding 'pythran/pythonic/include/builtins/KeyboardInterrupt.hpp' adding 'pythran/pythonic/include/builtins/LookupError.hpp' adding 'pythran/pythonic/include/builtins/MemoryError.hpp' adding 'pythran/pythonic/include/builtins/NameError.hpp' adding 'pythran/pythonic/include/builtins/None.hpp' adding 'pythran/pythonic/include/builtins/NotImplementedError.hpp' adding 'pythran/pythonic/include/builtins/OSError.hpp' adding 
'pythran/pythonic/include/builtins/OverflowError.hpp' adding 'pythran/pythonic/include/builtins/PendingDeprecationWarning.hpp' adding 'pythran/pythonic/include/builtins/ReferenceError.hpp' adding 'pythran/pythonic/include/builtins/RuntimeError.hpp' adding 'pythran/pythonic/include/builtins/RuntimeWarning.hpp' adding 'pythran/pythonic/include/builtins/StopIteration.hpp' adding 'pythran/pythonic/include/builtins/SyntaxError.hpp' adding 'pythran/pythonic/include/builtins/SyntaxWarning.hpp' adding 'pythran/pythonic/include/builtins/SystemError.hpp' adding 'pythran/pythonic/include/builtins/SystemExit.hpp' adding 'pythran/pythonic/include/builtins/TabError.hpp' adding 'pythran/pythonic/include/builtins/True.hpp' adding 'pythran/pythonic/include/builtins/TypeError.hpp' adding 'pythran/pythonic/include/builtins/UnboundLocalError.hpp' adding 'pythran/pythonic/include/builtins/UnicodeError.hpp' adding 'pythran/pythonic/include/builtins/UnicodeWarning.hpp' adding 'pythran/pythonic/include/builtins/UserWarning.hpp' adding 'pythran/pythonic/include/builtins/ValueError.hpp' adding 'pythran/pythonic/include/builtins/Warning.hpp' adding 'pythran/pythonic/include/builtins/ZeroDivisionError.hpp' adding 'pythran/pythonic/include/builtins/abs.hpp' adding 'pythran/pythonic/include/builtins/all.hpp' adding 'pythran/pythonic/include/builtins/any.hpp' adding 'pythran/pythonic/include/builtins/assert.hpp' adding 'pythran/pythonic/include/builtins/bin.hpp' adding 'pythran/pythonic/include/builtins/bool_.hpp' adding 'pythran/pythonic/include/builtins/chr.hpp' adding 'pythran/pythonic/include/builtins/complex.hpp' adding 'pythran/pythonic/include/builtins/dict.hpp' adding 'pythran/pythonic/include/builtins/divmod.hpp' adding 'pythran/pythonic/include/builtins/enumerate.hpp' adding 'pythran/pythonic/include/builtins/file.hpp' adding 'pythran/pythonic/include/builtins/filter.hpp' adding 'pythran/pythonic/include/builtins/float_.hpp' adding 'pythran/pythonic/include/builtins/getattr.hpp' adding 'pythran/pythonic/include/builtins/hex.hpp' adding 'pythran/pythonic/include/builtins/id.hpp' adding 'pythran/pythonic/include/builtins/in.hpp' adding 'pythran/pythonic/include/builtins/int_.hpp' adding 'pythran/pythonic/include/builtins/isinstance.hpp' adding 'pythran/pythonic/include/builtins/iter.hpp' adding 'pythran/pythonic/include/builtins/len.hpp' adding 'pythran/pythonic/include/builtins/list.hpp' adding 'pythran/pythonic/include/builtins/map.hpp' adding 'pythran/pythonic/include/builtins/max.hpp' adding 'pythran/pythonic/include/builtins/min.hpp' adding 'pythran/pythonic/include/builtins/minmax.hpp' adding 'pythran/pythonic/include/builtins/next.hpp' adding 'pythran/pythonic/include/builtins/oct.hpp' adding 'pythran/pythonic/include/builtins/open.hpp' adding 'pythran/pythonic/include/builtins/ord.hpp' adding 'pythran/pythonic/include/builtins/pow.hpp' adding 'pythran/pythonic/include/builtins/print.hpp' adding 'pythran/pythonic/include/builtins/range.hpp' adding 'pythran/pythonic/include/builtins/reduce.hpp' adding 'pythran/pythonic/include/builtins/reversed.hpp' adding 'pythran/pythonic/include/builtins/round.hpp' adding 'pythran/pythonic/include/builtins/set.hpp' adding 'pythran/pythonic/include/builtins/slice.hpp' adding 'pythran/pythonic/include/builtins/sorted.hpp' adding 'pythran/pythonic/include/builtins/str.hpp' adding 'pythran/pythonic/include/builtins/sum.hpp' adding 'pythran/pythonic/include/builtins/tuple.hpp' adding 'pythran/pythonic/include/builtins/type.hpp' adding 
'pythran/pythonic/include/builtins/xrange.hpp' adding 'pythran/pythonic/include/builtins/zip.hpp' adding 'pythran/pythonic/include/builtins/complex/conjugate.hpp' adding 'pythran/pythonic/include/builtins/dict/clear.hpp' adding 'pythran/pythonic/include/builtins/dict/copy.hpp' adding 'pythran/pythonic/include/builtins/dict/fromkeys.hpp' adding 'pythran/pythonic/include/builtins/dict/get.hpp' adding 'pythran/pythonic/include/builtins/dict/items.hpp' adding 'pythran/pythonic/include/builtins/dict/keys.hpp' adding 'pythran/pythonic/include/builtins/dict/pop.hpp' adding 'pythran/pythonic/include/builtins/dict/popitem.hpp' adding 'pythran/pythonic/include/builtins/dict/setdefault.hpp' adding 'pythran/pythonic/include/builtins/dict/update.hpp' adding 'pythran/pythonic/include/builtins/dict/values.hpp' adding 'pythran/pythonic/include/builtins/file/close.hpp' adding 'pythran/pythonic/include/builtins/file/fileno.hpp' adding 'pythran/pythonic/include/builtins/file/flush.hpp' adding 'pythran/pythonic/include/builtins/file/isatty.hpp' adding 'pythran/pythonic/include/builtins/file/next.hpp' adding 'pythran/pythonic/include/builtins/file/read.hpp' adding 'pythran/pythonic/include/builtins/file/readline.hpp' adding 'pythran/pythonic/include/builtins/file/readlines.hpp' adding 'pythran/pythonic/include/builtins/file/seek.hpp' adding 'pythran/pythonic/include/builtins/file/tell.hpp' adding 'pythran/pythonic/include/builtins/file/truncate.hpp' adding 'pythran/pythonic/include/builtins/file/write.hpp' adding 'pythran/pythonic/include/builtins/file/writelines.hpp' adding 'pythran/pythonic/include/builtins/float_/is_integer.hpp' adding 'pythran/pythonic/include/builtins/list/append.hpp' adding 'pythran/pythonic/include/builtins/list/count.hpp' adding 'pythran/pythonic/include/builtins/list/extend.hpp' adding 'pythran/pythonic/include/builtins/list/insert.hpp' adding 'pythran/pythonic/include/builtins/list/pop.hpp' adding 'pythran/pythonic/include/builtins/list/remove.hpp' adding 'pythran/pythonic/include/builtins/list/reverse.hpp' adding 'pythran/pythonic/include/builtins/list/sort.hpp' adding 'pythran/pythonic/include/builtins/pythran/StaticIfBreak.hpp' adding 'pythran/pythonic/include/builtins/pythran/StaticIfCont.hpp' adding 'pythran/pythonic/include/builtins/pythran/StaticIfNoReturn.hpp' adding 'pythran/pythonic/include/builtins/pythran/StaticIfReturn.hpp' adding 'pythran/pythonic/include/builtins/pythran/abssqr.hpp' adding 'pythran/pythonic/include/builtins/pythran/and_.hpp' adding 'pythran/pythonic/include/builtins/pythran/is_none.hpp' adding 'pythran/pythonic/include/builtins/pythran/kwonly.hpp' adding 'pythran/pythonic/include/builtins/pythran/len_set.hpp' adding 'pythran/pythonic/include/builtins/pythran/make_shape.hpp' adding 'pythran/pythonic/include/builtins/pythran/or_.hpp' adding 'pythran/pythonic/include/builtins/pythran/static_if.hpp' adding 'pythran/pythonic/include/builtins/pythran/static_list.hpp' adding 'pythran/pythonic/include/builtins/set/add.hpp' adding 'pythran/pythonic/include/builtins/set/clear.hpp' adding 'pythran/pythonic/include/builtins/set/copy.hpp' adding 'pythran/pythonic/include/builtins/set/difference.hpp' adding 'pythran/pythonic/include/builtins/set/difference_update.hpp' adding 'pythran/pythonic/include/builtins/set/discard.hpp' adding 'pythran/pythonic/include/builtins/set/intersection.hpp' adding 'pythran/pythonic/include/builtins/set/intersection_update.hpp' adding 'pythran/pythonic/include/builtins/set/isdisjoint.hpp' adding 
'pythran/pythonic/include/builtins/set/issubset.hpp' adding 'pythran/pythonic/include/builtins/set/issuperset.hpp' adding 'pythran/pythonic/include/builtins/set/remove.hpp' adding 'pythran/pythonic/include/builtins/set/symmetric_difference.hpp' adding 'pythran/pythonic/include/builtins/set/symmetric_difference_update.hpp' adding 'pythran/pythonic/include/builtins/set/union_.hpp' adding 'pythran/pythonic/include/builtins/set/update.hpp' adding 'pythran/pythonic/include/builtins/str/__mod__.hpp' adding 'pythran/pythonic/include/builtins/str/capitalize.hpp' adding 'pythran/pythonic/include/builtins/str/count.hpp' adding 'pythran/pythonic/include/builtins/str/endswith.hpp' adding 'pythran/pythonic/include/builtins/str/find.hpp' adding 'pythran/pythonic/include/builtins/str/isalpha.hpp' adding 'pythran/pythonic/include/builtins/str/isdigit.hpp' adding 'pythran/pythonic/include/builtins/str/join.hpp' adding 'pythran/pythonic/include/builtins/str/lower.hpp' adding 'pythran/pythonic/include/builtins/str/lstrip.hpp' adding 'pythran/pythonic/include/builtins/str/replace.hpp' adding 'pythran/pythonic/include/builtins/str/rstrip.hpp' adding 'pythran/pythonic/include/builtins/str/split.hpp' adding 'pythran/pythonic/include/builtins/str/startswith.hpp' adding 'pythran/pythonic/include/builtins/str/strip.hpp' adding 'pythran/pythonic/include/builtins/str/upper.hpp' adding 'pythran/pythonic/include/cmath/acos.hpp' adding 'pythran/pythonic/include/cmath/acosh.hpp' adding 'pythran/pythonic/include/cmath/asin.hpp' adding 'pythran/pythonic/include/cmath/asinh.hpp' adding 'pythran/pythonic/include/cmath/atan.hpp' adding 'pythran/pythonic/include/cmath/atanh.hpp' adding 'pythran/pythonic/include/cmath/cos.hpp' adding 'pythran/pythonic/include/cmath/cosh.hpp' adding 'pythran/pythonic/include/cmath/e.hpp' adding 'pythran/pythonic/include/cmath/exp.hpp' adding 'pythran/pythonic/include/cmath/isinf.hpp' adding 'pythran/pythonic/include/cmath/isnan.hpp' adding 'pythran/pythonic/include/cmath/log.hpp' adding 'pythran/pythonic/include/cmath/log10.hpp' adding 'pythran/pythonic/include/cmath/pi.hpp' adding 'pythran/pythonic/include/cmath/sin.hpp' adding 'pythran/pythonic/include/cmath/sinh.hpp' adding 'pythran/pythonic/include/cmath/sqrt.hpp' adding 'pythran/pythonic/include/cmath/tan.hpp' adding 'pythran/pythonic/include/cmath/tanh.hpp' adding 'pythran/pythonic/include/functools/partial.hpp' adding 'pythran/pythonic/include/functools/reduce.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/close.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/fileno.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/flush.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/isatty.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/next.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/read.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/readline.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/readlines.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/seek.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/tell.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/truncate.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/write.hpp' adding 'pythran/pythonic/include/io/_io/TextIOWrapper/writelines.hpp' adding 'pythran/pythonic/include/itertools/combinations.hpp' adding 'pythran/pythonic/include/itertools/common.hpp' adding 'pythran/pythonic/include/itertools/count.hpp' adding 'pythran/pythonic/include/itertools/ifilter.hpp' adding 
'pythran/pythonic/include/itertools/islice.hpp' adding 'pythran/pythonic/include/itertools/permutations.hpp' adding 'pythran/pythonic/include/itertools/product.hpp' adding 'pythran/pythonic/include/itertools/repeat.hpp' adding 'pythran/pythonic/include/math/acos.hpp' adding 'pythran/pythonic/include/math/acosh.hpp' adding 'pythran/pythonic/include/math/asin.hpp' adding 'pythran/pythonic/include/math/asinh.hpp' adding 'pythran/pythonic/include/math/atan.hpp' adding 'pythran/pythonic/include/math/atan2.hpp' adding 'pythran/pythonic/include/math/atanh.hpp' adding 'pythran/pythonic/include/math/ceil.hpp' adding 'pythran/pythonic/include/math/copysign.hpp' adding 'pythran/pythonic/include/math/cos.hpp' adding 'pythran/pythonic/include/math/cosh.hpp' adding 'pythran/pythonic/include/math/degrees.hpp' adding 'pythran/pythonic/include/math/e.hpp' adding 'pythran/pythonic/include/math/erf.hpp' adding 'pythran/pythonic/include/math/erfc.hpp' adding 'pythran/pythonic/include/math/exp.hpp' adding 'pythran/pythonic/include/math/expm1.hpp' adding 'pythran/pythonic/include/math/fabs.hpp' adding 'pythran/pythonic/include/math/factorial.hpp' adding 'pythran/pythonic/include/math/floor.hpp' adding 'pythran/pythonic/include/math/fmod.hpp' adding 'pythran/pythonic/include/math/frexp.hpp' adding 'pythran/pythonic/include/math/gamma.hpp' adding 'pythran/pythonic/include/math/hypot.hpp' adding 'pythran/pythonic/include/math/isinf.hpp' adding 'pythran/pythonic/include/math/isnan.hpp' adding 'pythran/pythonic/include/math/ldexp.hpp' adding 'pythran/pythonic/include/math/lgamma.hpp' adding 'pythran/pythonic/include/math/log.hpp' adding 'pythran/pythonic/include/math/log10.hpp' adding 'pythran/pythonic/include/math/log1p.hpp' adding 'pythran/pythonic/include/math/modf.hpp' adding 'pythran/pythonic/include/math/pi.hpp' adding 'pythran/pythonic/include/math/pow.hpp' adding 'pythran/pythonic/include/math/radians.hpp' adding 'pythran/pythonic/include/math/sin.hpp' adding 'pythran/pythonic/include/math/sinh.hpp' adding 'pythran/pythonic/include/math/sqrt.hpp' adding 'pythran/pythonic/include/math/tan.hpp' adding 'pythran/pythonic/include/math/tanh.hpp' adding 'pythran/pythonic/include/math/trunc.hpp' adding 'pythran/pythonic/include/numpy/NINF.hpp' adding 'pythran/pythonic/include/numpy/abs.hpp' adding 'pythran/pythonic/include/numpy/absolute.hpp' adding 'pythran/pythonic/include/numpy/add.hpp' adding 'pythran/pythonic/include/numpy/alen.hpp' adding 'pythran/pythonic/include/numpy/all.hpp' adding 'pythran/pythonic/include/numpy/allclose.hpp' adding 'pythran/pythonic/include/numpy/alltrue.hpp' adding 'pythran/pythonic/include/numpy/amax.hpp' adding 'pythran/pythonic/include/numpy/amin.hpp' adding 'pythran/pythonic/include/numpy/angle.hpp' adding 'pythran/pythonic/include/numpy/angle_in_deg.hpp' adding 'pythran/pythonic/include/numpy/angle_in_rad.hpp' adding 'pythran/pythonic/include/numpy/any.hpp' adding 'pythran/pythonic/include/numpy/append.hpp' adding 'pythran/pythonic/include/numpy/arange.hpp' adding 'pythran/pythonic/include/numpy/arccos.hpp' adding 'pythran/pythonic/include/numpy/arccosh.hpp' adding 'pythran/pythonic/include/numpy/arcsin.hpp' adding 'pythran/pythonic/include/numpy/arcsinh.hpp' adding 'pythran/pythonic/include/numpy/arctan.hpp' adding 'pythran/pythonic/include/numpy/arctan2.hpp' adding 'pythran/pythonic/include/numpy/arctanh.hpp' adding 'pythran/pythonic/include/numpy/argmax.hpp' adding 'pythran/pythonic/include/numpy/argmin.hpp' adding 'pythran/pythonic/include/numpy/argsort.hpp' adding 
'pythran/pythonic/include/numpy/argwhere.hpp' adding 'pythran/pythonic/include/numpy/around.hpp' adding 'pythran/pythonic/include/numpy/array.hpp' adding 'pythran/pythonic/include/numpy/array2string.hpp' adding 'pythran/pythonic/include/numpy/array_equal.hpp' adding 'pythran/pythonic/include/numpy/array_equiv.hpp' adding 'pythran/pythonic/include/numpy/array_split.hpp' adding 'pythran/pythonic/include/numpy/array_str.hpp' adding 'pythran/pythonic/include/numpy/asarray.hpp' adding 'pythran/pythonic/include/numpy/asarray_chkfinite.hpp' adding 'pythran/pythonic/include/numpy/ascontiguousarray.hpp' adding 'pythran/pythonic/include/numpy/asfarray.hpp' adding 'pythran/pythonic/include/numpy/asscalar.hpp' adding 'pythran/pythonic/include/numpy/atleast_1d.hpp' adding 'pythran/pythonic/include/numpy/atleast_2d.hpp' adding 'pythran/pythonic/include/numpy/atleast_3d.hpp' adding 'pythran/pythonic/include/numpy/average.hpp' adding 'pythran/pythonic/include/numpy/base_repr.hpp' adding 'pythran/pythonic/include/numpy/binary_repr.hpp' adding 'pythran/pythonic/include/numpy/bincount.hpp' adding 'pythran/pythonic/include/numpy/bitwise_and.hpp' adding 'pythran/pythonic/include/numpy/bitwise_not.hpp' adding 'pythran/pythonic/include/numpy/bitwise_or.hpp' adding 'pythran/pythonic/include/numpy/bitwise_xor.hpp' adding 'pythran/pythonic/include/numpy/bool_.hpp' adding 'pythran/pythonic/include/numpy/broadcast_to.hpp' adding 'pythran/pythonic/include/numpy/byte.hpp' adding 'pythran/pythonic/include/numpy/cbrt.hpp' adding 'pythran/pythonic/include/numpy/ceil.hpp' adding 'pythran/pythonic/include/numpy/clip.hpp' adding 'pythran/pythonic/include/numpy/complex.hpp' adding 'pythran/pythonic/include/numpy/complex128.hpp' adding 'pythran/pythonic/include/numpy/complex256.hpp' adding 'pythran/pythonic/include/numpy/complex64.hpp' adding 'pythran/pythonic/include/numpy/concatenate.hpp' adding 'pythran/pythonic/include/numpy/conj.hpp' adding 'pythran/pythonic/include/numpy/conjugate.hpp' adding 'pythran/pythonic/include/numpy/convolve.hpp' adding 'pythran/pythonic/include/numpy/copy.hpp' adding 'pythran/pythonic/include/numpy/copysign.hpp' adding 'pythran/pythonic/include/numpy/copyto.hpp' adding 'pythran/pythonic/include/numpy/correlate.hpp' adding 'pythran/pythonic/include/numpy/cos.hpp' adding 'pythran/pythonic/include/numpy/cosh.hpp' adding 'pythran/pythonic/include/numpy/count_nonzero.hpp' adding 'pythran/pythonic/include/numpy/cross.hpp' adding 'pythran/pythonic/include/numpy/cumprod.hpp' adding 'pythran/pythonic/include/numpy/cumproduct.hpp' adding 'pythran/pythonic/include/numpy/cumsum.hpp' adding 'pythran/pythonic/include/numpy/deg2rad.hpp' adding 'pythran/pythonic/include/numpy/degrees.hpp' adding 'pythran/pythonic/include/numpy/delete_.hpp' adding 'pythran/pythonic/include/numpy/diag.hpp' adding 'pythran/pythonic/include/numpy/diagflat.hpp' adding 'pythran/pythonic/include/numpy/diagonal.hpp' adding 'pythran/pythonic/include/numpy/diff.hpp' adding 'pythran/pythonic/include/numpy/digitize.hpp' adding 'pythran/pythonic/include/numpy/divide.hpp' adding 'pythran/pythonic/include/numpy/dot.hpp' adding 'pythran/pythonic/include/numpy/double_.hpp' adding 'pythran/pythonic/include/numpy/e.hpp' adding 'pythran/pythonic/include/numpy/ediff1d.hpp' adding 'pythran/pythonic/include/numpy/empty.hpp' adding 'pythran/pythonic/include/numpy/empty_like.hpp' adding 'pythran/pythonic/include/numpy/equal.hpp' adding 'pythran/pythonic/include/numpy/exp.hpp' adding 'pythran/pythonic/include/numpy/expand_dims.hpp' adding 
'pythran/pythonic/include/numpy/expm1.hpp' adding 'pythran/pythonic/include/numpy/eye.hpp' adding 'pythran/pythonic/include/numpy/fabs.hpp' adding 'pythran/pythonic/include/numpy/fill_diagonal.hpp' adding 'pythran/pythonic/include/numpy/finfo.hpp' adding 'pythran/pythonic/include/numpy/fix.hpp' adding 'pythran/pythonic/include/numpy/flatnonzero.hpp' adding 'pythran/pythonic/include/numpy/flip.hpp' adding 'pythran/pythonic/include/numpy/fliplr.hpp' adding 'pythran/pythonic/include/numpy/flipud.hpp' adding 'pythran/pythonic/include/numpy/float128.hpp' adding 'pythran/pythonic/include/numpy/float32.hpp' adding 'pythran/pythonic/include/numpy/float64.hpp' adding 'pythran/pythonic/include/numpy/float_.hpp' adding 'pythran/pythonic/include/numpy/floor.hpp' adding 'pythran/pythonic/include/numpy/floor_divide.hpp' adding 'pythran/pythonic/include/numpy/fmax.hpp' adding 'pythran/pythonic/include/numpy/fmin.hpp' adding 'pythran/pythonic/include/numpy/fmod.hpp' adding 'pythran/pythonic/include/numpy/frexp.hpp' adding 'pythran/pythonic/include/numpy/fromfile.hpp' adding 'pythran/pythonic/include/numpy/fromfunction.hpp' adding 'pythran/pythonic/include/numpy/fromiter.hpp' adding 'pythran/pythonic/include/numpy/fromstring.hpp' adding 'pythran/pythonic/include/numpy/full.hpp' adding 'pythran/pythonic/include/numpy/full_like.hpp' adding 'pythran/pythonic/include/numpy/greater.hpp' adding 'pythran/pythonic/include/numpy/greater_equal.hpp' adding 'pythran/pythonic/include/numpy/heaviside.hpp' adding 'pythran/pythonic/include/numpy/hstack.hpp' adding 'pythran/pythonic/include/numpy/hypot.hpp' adding 'pythran/pythonic/include/numpy/identity.hpp' adding 'pythran/pythonic/include/numpy/imag.hpp' adding 'pythran/pythonic/include/numpy/indices.hpp' adding 'pythran/pythonic/include/numpy/inf.hpp' adding 'pythran/pythonic/include/numpy/inner.hpp' adding 'pythran/pythonic/include/numpy/insert.hpp' adding 'pythran/pythonic/include/numpy/int16.hpp' adding 'pythran/pythonic/include/numpy/int32.hpp' adding 'pythran/pythonic/include/numpy/int64.hpp' adding 'pythran/pythonic/include/numpy/int8.hpp' adding 'pythran/pythonic/include/numpy/int_.hpp' adding 'pythran/pythonic/include/numpy/intc.hpp' adding 'pythran/pythonic/include/numpy/interp.hpp' adding 'pythran/pythonic/include/numpy/intersect1d.hpp' adding 'pythran/pythonic/include/numpy/intp.hpp' adding 'pythran/pythonic/include/numpy/invert.hpp' adding 'pythran/pythonic/include/numpy/isclose.hpp' adding 'pythran/pythonic/include/numpy/iscomplex.hpp' adding 'pythran/pythonic/include/numpy/isfinite.hpp' adding 'pythran/pythonic/include/numpy/isinf.hpp' adding 'pythran/pythonic/include/numpy/isnan.hpp' adding 'pythran/pythonic/include/numpy/isneginf.hpp' adding 'pythran/pythonic/include/numpy/isposinf.hpp' adding 'pythran/pythonic/include/numpy/isreal.hpp' adding 'pythran/pythonic/include/numpy/isrealobj.hpp' adding 'pythran/pythonic/include/numpy/isscalar.hpp' adding 'pythran/pythonic/include/numpy/issctype.hpp' adding 'pythran/pythonic/include/numpy/ldexp.hpp' adding 'pythran/pythonic/include/numpy/left_shift.hpp' adding 'pythran/pythonic/include/numpy/less.hpp' adding 'pythran/pythonic/include/numpy/less_equal.hpp' adding 'pythran/pythonic/include/numpy/lexsort.hpp' adding 'pythran/pythonic/include/numpy/linspace.hpp' adding 'pythran/pythonic/include/numpy/log.hpp' adding 'pythran/pythonic/include/numpy/log10.hpp' adding 'pythran/pythonic/include/numpy/log1p.hpp' adding 'pythran/pythonic/include/numpy/log2.hpp' adding 'pythran/pythonic/include/numpy/logaddexp.hpp' 
adding 'pythran/pythonic/include/numpy/logaddexp2.hpp' adding 'pythran/pythonic/include/numpy/logical_and.hpp' adding 'pythran/pythonic/include/numpy/logical_not.hpp' adding 'pythran/pythonic/include/numpy/logical_or.hpp' adding 'pythran/pythonic/include/numpy/logical_xor.hpp' adding 'pythran/pythonic/include/numpy/logspace.hpp' adding 'pythran/pythonic/include/numpy/longlong.hpp' adding 'pythran/pythonic/include/numpy/max.hpp' adding 'pythran/pythonic/include/numpy/maximum.hpp' adding 'pythran/pythonic/include/numpy/mean.hpp' adding 'pythran/pythonic/include/numpy/median.hpp' adding 'pythran/pythonic/include/numpy/min.hpp' adding 'pythran/pythonic/include/numpy/minimum.hpp' adding 'pythran/pythonic/include/numpy/mod.hpp' adding 'pythran/pythonic/include/numpy/multiply.hpp' adding 'pythran/pythonic/include/numpy/nan.hpp' adding 'pythran/pythonic/include/numpy/nan_to_num.hpp' adding 'pythran/pythonic/include/numpy/nanargmax.hpp' adding 'pythran/pythonic/include/numpy/nanargmin.hpp' adding 'pythran/pythonic/include/numpy/nanmax.hpp' adding 'pythran/pythonic/include/numpy/nanmin.hpp' adding 'pythran/pythonic/include/numpy/nansum.hpp' adding 'pythran/pythonic/include/numpy/ndarray.hpp' adding 'pythran/pythonic/include/numpy/ndenumerate.hpp' adding 'pythran/pythonic/include/numpy/ndim.hpp' adding 'pythran/pythonic/include/numpy/ndindex.hpp' adding 'pythran/pythonic/include/numpy/negative.hpp' adding 'pythran/pythonic/include/numpy/newaxis.hpp' adding 'pythran/pythonic/include/numpy/nextafter.hpp' adding 'pythran/pythonic/include/numpy/nonzero.hpp' adding 'pythran/pythonic/include/numpy/not_equal.hpp' adding 'pythran/pythonic/include/numpy/ones.hpp' adding 'pythran/pythonic/include/numpy/ones_like.hpp' adding 'pythran/pythonic/include/numpy/outer.hpp' adding 'pythran/pythonic/include/numpy/partial_sum.hpp' adding 'pythran/pythonic/include/numpy/pi.hpp' adding 'pythran/pythonic/include/numpy/place.hpp' adding 'pythran/pythonic/include/numpy/power.hpp' adding 'pythran/pythonic/include/numpy/prod.hpp' adding 'pythran/pythonic/include/numpy/product.hpp' adding 'pythran/pythonic/include/numpy/ptp.hpp' adding 'pythran/pythonic/include/numpy/put.hpp' adding 'pythran/pythonic/include/numpy/putmask.hpp' adding 'pythran/pythonic/include/numpy/rad2deg.hpp' adding 'pythran/pythonic/include/numpy/radians.hpp' adding 'pythran/pythonic/include/numpy/ravel.hpp' adding 'pythran/pythonic/include/numpy/real.hpp' adding 'pythran/pythonic/include/numpy/reciprocal.hpp' adding 'pythran/pythonic/include/numpy/reduce.hpp' adding 'pythran/pythonic/include/numpy/remainder.hpp' adding 'pythran/pythonic/include/numpy/repeat.hpp' adding 'pythran/pythonic/include/numpy/resize.hpp' adding 'pythran/pythonic/include/numpy/right_shift.hpp' adding 'pythran/pythonic/include/numpy/rint.hpp' adding 'pythran/pythonic/include/numpy/roll.hpp' adding 'pythran/pythonic/include/numpy/rollaxis.hpp' adding 'pythran/pythonic/include/numpy/rot90.hpp' adding 'pythran/pythonic/include/numpy/round.hpp' adding 'pythran/pythonic/include/numpy/round_.hpp' adding 'pythran/pythonic/include/numpy/searchsorted.hpp' adding 'pythran/pythonic/include/numpy/select.hpp' adding 'pythran/pythonic/include/numpy/setdiff1d.hpp' adding 'pythran/pythonic/include/numpy/shape.hpp' adding 'pythran/pythonic/include/numpy/short_.hpp' adding 'pythran/pythonic/include/numpy/sign.hpp' adding 'pythran/pythonic/include/numpy/signbit.hpp' adding 'pythran/pythonic/include/numpy/sin.hpp' adding 'pythran/pythonic/include/numpy/sinh.hpp' adding 
'pythran/pythonic/include/numpy/size.hpp' adding 'pythran/pythonic/include/numpy/sometrue.hpp' adding 'pythran/pythonic/include/numpy/sort.hpp' adding 'pythran/pythonic/include/numpy/sort_complex.hpp' adding 'pythran/pythonic/include/numpy/spacing.hpp' adding 'pythran/pythonic/include/numpy/split.hpp' adding 'pythran/pythonic/include/numpy/sqrt.hpp' adding 'pythran/pythonic/include/numpy/square.hpp' adding 'pythran/pythonic/include/numpy/stack.hpp' adding 'pythran/pythonic/include/numpy/std_.hpp' adding 'pythran/pythonic/include/numpy/subtract.hpp' adding 'pythran/pythonic/include/numpy/sum.hpp' adding 'pythran/pythonic/include/numpy/swapaxes.hpp' adding 'pythran/pythonic/include/numpy/take.hpp' adding 'pythran/pythonic/include/numpy/tan.hpp' adding 'pythran/pythonic/include/numpy/tanh.hpp' adding 'pythran/pythonic/include/numpy/tile.hpp' adding 'pythran/pythonic/include/numpy/trace.hpp' adding 'pythran/pythonic/include/numpy/transpose.hpp' adding 'pythran/pythonic/include/numpy/tri.hpp' adding 'pythran/pythonic/include/numpy/tril.hpp' adding 'pythran/pythonic/include/numpy/trim_zeros.hpp' adding 'pythran/pythonic/include/numpy/triu.hpp' adding 'pythran/pythonic/include/numpy/true_divide.hpp' adding 'pythran/pythonic/include/numpy/trunc.hpp' adding 'pythran/pythonic/include/numpy/ubyte.hpp' adding 'pythran/pythonic/include/numpy/ufunc_accumulate.hpp' adding 'pythran/pythonic/include/numpy/ufunc_reduce.hpp' adding 'pythran/pythonic/include/numpy/uint.hpp' adding 'pythran/pythonic/include/numpy/uint16.hpp' adding 'pythran/pythonic/include/numpy/uint32.hpp' adding 'pythran/pythonic/include/numpy/uint64.hpp' adding 'pythran/pythonic/include/numpy/uint8.hpp' adding 'pythran/pythonic/include/numpy/uintc.hpp' adding 'pythran/pythonic/include/numpy/uintp.hpp' adding 'pythran/pythonic/include/numpy/ulonglong.hpp' adding 'pythran/pythonic/include/numpy/union1d.hpp' adding 'pythran/pythonic/include/numpy/unique.hpp' adding 'pythran/pythonic/include/numpy/unravel_index.hpp' adding 'pythran/pythonic/include/numpy/unwrap.hpp' adding 'pythran/pythonic/include/numpy/ushort.hpp' adding 'pythran/pythonic/include/numpy/var.hpp' adding 'pythran/pythonic/include/numpy/vdot.hpp' adding 'pythran/pythonic/include/numpy/vstack.hpp' adding 'pythran/pythonic/include/numpy/where.hpp' adding 'pythran/pythonic/include/numpy/zeros.hpp' adding 'pythran/pythonic/include/numpy/zeros_like.hpp' adding 'pythran/pythonic/include/numpy/add/accumulate.hpp' adding 'pythran/pythonic/include/numpy/add/reduce.hpp' adding 'pythran/pythonic/include/numpy/arctan2/accumulate.hpp' adding 'pythran/pythonic/include/numpy/bitwise_and/accumulate.hpp' adding 'pythran/pythonic/include/numpy/bitwise_and/reduce.hpp' adding 'pythran/pythonic/include/numpy/bitwise_or/accumulate.hpp' adding 'pythran/pythonic/include/numpy/bitwise_or/reduce.hpp' adding 'pythran/pythonic/include/numpy/bitwise_xor/accumulate.hpp' adding 'pythran/pythonic/include/numpy/bitwise_xor/reduce.hpp' adding 'pythran/pythonic/include/numpy/copysign/accumulate.hpp' adding 'pythran/pythonic/include/numpy/ctypeslib/as_array.hpp' adding 'pythran/pythonic/include/numpy/divide/accumulate.hpp' adding 'pythran/pythonic/include/numpy/dtype/type.hpp' adding 'pythran/pythonic/include/numpy/equal/accumulate.hpp' adding 'pythran/pythonic/include/numpy/fft/c2c.hpp' adding 'pythran/pythonic/include/numpy/fft/fft.hpp' adding 'pythran/pythonic/include/numpy/fft/hfft.hpp' adding 'pythran/pythonic/include/numpy/fft/ifft.hpp' adding 'pythran/pythonic/include/numpy/fft/ihfft.hpp' adding 
'pythran/pythonic/include/numpy/fft/irfft.hpp' adding 'pythran/pythonic/include/numpy/fft/rfft.hpp' adding 'pythran/pythonic/include/numpy/floor_divide/accumulate.hpp' adding 'pythran/pythonic/include/numpy/fmax/accumulate.hpp' adding 'pythran/pythonic/include/numpy/fmax/reduce.hpp' adding 'pythran/pythonic/include/numpy/fmin/accumulate.hpp' adding 'pythran/pythonic/include/numpy/fmin/reduce.hpp' adding 'pythran/pythonic/include/numpy/fmod/accumulate.hpp' adding 'pythran/pythonic/include/numpy/greater/accumulate.hpp' adding 'pythran/pythonic/include/numpy/greater_equal/accumulate.hpp' adding 'pythran/pythonic/include/numpy/heaviside/accumulate.hpp' adding 'pythran/pythonic/include/numpy/hypot/accumulate.hpp' adding 'pythran/pythonic/include/numpy/ldexp/accumulate.hpp' adding 'pythran/pythonic/include/numpy/left_shift/accumulate.hpp' adding 'pythran/pythonic/include/numpy/less/accumulate.hpp' adding 'pythran/pythonic/include/numpy/less_equal/accumulate.hpp' adding 'pythran/pythonic/include/numpy/linalg/matrix_power.hpp' adding 'pythran/pythonic/include/numpy/linalg/norm.hpp' adding 'pythran/pythonic/include/numpy/logaddexp/accumulate.hpp' adding 'pythran/pythonic/include/numpy/logaddexp2/accumulate.hpp' adding 'pythran/pythonic/include/numpy/logical_and/accumulate.hpp' adding 'pythran/pythonic/include/numpy/logical_or/accumulate.hpp' adding 'pythran/pythonic/include/numpy/logical_xor/accumulate.hpp' adding 'pythran/pythonic/include/numpy/maximum/accumulate.hpp' adding 'pythran/pythonic/include/numpy/maximum/reduce.hpp' adding 'pythran/pythonic/include/numpy/minimum/accumulate.hpp' adding 'pythran/pythonic/include/numpy/minimum/reduce.hpp' adding 'pythran/pythonic/include/numpy/mod/accumulate.hpp' adding 'pythran/pythonic/include/numpy/multiply/accumulate.hpp' adding 'pythran/pythonic/include/numpy/multiply/reduce.hpp' adding 'pythran/pythonic/include/numpy/ndarray/astype.hpp' adding 'pythran/pythonic/include/numpy/ndarray/fill.hpp' adding 'pythran/pythonic/include/numpy/ndarray/flatten.hpp' adding 'pythran/pythonic/include/numpy/ndarray/item.hpp' adding 'pythran/pythonic/include/numpy/ndarray/reshape.hpp' adding 'pythran/pythonic/include/numpy/ndarray/sort.hpp' adding 'pythran/pythonic/include/numpy/ndarray/tofile.hpp' adding 'pythran/pythonic/include/numpy/ndarray/tolist.hpp' adding 'pythran/pythonic/include/numpy/ndarray/tostring.hpp' adding 'pythran/pythonic/include/numpy/negative/accumulate.hpp' adding 'pythran/pythonic/include/numpy/nextafter/accumulate.hpp' adding 'pythran/pythonic/include/numpy/not_equal/accumulate.hpp' adding 'pythran/pythonic/include/numpy/power/accumulate.hpp' adding 'pythran/pythonic/include/numpy/random/binomial.hpp' adding 'pythran/pythonic/include/numpy/random/bytes.hpp' adding 'pythran/pythonic/include/numpy/random/chisquare.hpp' adding 'pythran/pythonic/include/numpy/random/choice.hpp' adding 'pythran/pythonic/include/numpy/random/dirichlet.hpp' adding 'pythran/pythonic/include/numpy/random/exponential.hpp' adding 'pythran/pythonic/include/numpy/random/f.hpp' adding 'pythran/pythonic/include/numpy/random/gamma.hpp' adding 'pythran/pythonic/include/numpy/random/generator.hpp' adding 'pythran/pythonic/include/numpy/random/geometric.hpp' adding 'pythran/pythonic/include/numpy/random/gumbel.hpp' adding 'pythran/pythonic/include/numpy/random/laplace.hpp' adding 'pythran/pythonic/include/numpy/random/logistic.hpp' adding 'pythran/pythonic/include/numpy/random/lognormal.hpp' adding 'pythran/pythonic/include/numpy/random/logseries.hpp' adding 
'pythran/pythonic/include/numpy/random/negative_binomial.hpp' adding 'pythran/pythonic/include/numpy/random/normal.hpp' adding 'pythran/pythonic/include/numpy/random/pareto.hpp' adding 'pythran/pythonic/include/numpy/random/poisson.hpp' adding 'pythran/pythonic/include/numpy/random/power.hpp' adding 'pythran/pythonic/include/numpy/random/rand.hpp' adding 'pythran/pythonic/include/numpy/random/randint.hpp' adding 'pythran/pythonic/include/numpy/random/randn.hpp' adding 'pythran/pythonic/include/numpy/random/random.hpp' adding 'pythran/pythonic/include/numpy/random/random_integers.hpp' adding 'pythran/pythonic/include/numpy/random/random_sample.hpp' adding 'pythran/pythonic/include/numpy/random/ranf.hpp' adding 'pythran/pythonic/include/numpy/random/rayleigh.hpp' adding 'pythran/pythonic/include/numpy/random/sample.hpp' adding 'pythran/pythonic/include/numpy/random/seed.hpp' adding 'pythran/pythonic/include/numpy/random/shuffle.hpp' adding 'pythran/pythonic/include/numpy/random/standard_exponential.hpp' adding 'pythran/pythonic/include/numpy/random/standard_gamma.hpp' adding 'pythran/pythonic/include/numpy/random/standard_normal.hpp' adding 'pythran/pythonic/include/numpy/random/uniform.hpp' adding 'pythran/pythonic/include/numpy/random/weibull.hpp' adding 'pythran/pythonic/include/numpy/remainder/accumulate.hpp' adding 'pythran/pythonic/include/numpy/right_shift/accumulate.hpp' adding 'pythran/pythonic/include/numpy/subtract/accumulate.hpp' adding 'pythran/pythonic/include/numpy/true_divide/accumulate.hpp' adding 'pythran/pythonic/include/omp/get_num_threads.hpp' adding 'pythran/pythonic/include/omp/get_thread_num.hpp' adding 'pythran/pythonic/include/omp/get_wtick.hpp' adding 'pythran/pythonic/include/omp/get_wtime.hpp' adding 'pythran/pythonic/include/omp/in_parallel.hpp' adding 'pythran/pythonic/include/omp/set_nested.hpp' adding 'pythran/pythonic/include/operator_/__abs__.hpp' adding 'pythran/pythonic/include/operator_/__add__.hpp' adding 'pythran/pythonic/include/operator_/__and__.hpp' adding 'pythran/pythonic/include/operator_/__concat__.hpp' adding 'pythran/pythonic/include/operator_/__contains__.hpp' adding 'pythran/pythonic/include/operator_/__delitem__.hpp' adding 'pythran/pythonic/include/operator_/__div__.hpp' adding 'pythran/pythonic/include/operator_/__eq__.hpp' adding 'pythran/pythonic/include/operator_/__floordiv__.hpp' adding 'pythran/pythonic/include/operator_/__ge__.hpp' adding 'pythran/pythonic/include/operator_/__getitem__.hpp' adding 'pythran/pythonic/include/operator_/__gt__.hpp' adding 'pythran/pythonic/include/operator_/__iadd__.hpp' adding 'pythran/pythonic/include/operator_/__iand__.hpp' adding 'pythran/pythonic/include/operator_/__iconcat__.hpp' adding 'pythran/pythonic/include/operator_/__idiv__.hpp' adding 'pythran/pythonic/include/operator_/__ifloordiv__.hpp' adding 'pythran/pythonic/include/operator_/__ilshift__.hpp' adding 'pythran/pythonic/include/operator_/__imod__.hpp' adding 'pythran/pythonic/include/operator_/__imul__.hpp' adding 'pythran/pythonic/include/operator_/__inv__.hpp' adding 'pythran/pythonic/include/operator_/__invert__.hpp' adding 'pythran/pythonic/include/operator_/__ior__.hpp' adding 'pythran/pythonic/include/operator_/__ipow__.hpp' adding 'pythran/pythonic/include/operator_/__irshift__.hpp' adding 'pythran/pythonic/include/operator_/__isub__.hpp' adding 'pythran/pythonic/include/operator_/__itruediv__.hpp' adding 'pythran/pythonic/include/operator_/__ixor__.hpp' adding 'pythran/pythonic/include/operator_/__le__.hpp' adding 
'pythran/pythonic/include/operator_/__lshift__.hpp' adding 'pythran/pythonic/include/operator_/__lt__.hpp' adding 'pythran/pythonic/include/operator_/__matmul__.hpp' adding 'pythran/pythonic/include/operator_/__mod__.hpp' adding 'pythran/pythonic/include/operator_/__mul__.hpp' adding 'pythran/pythonic/include/operator_/__ne__.hpp' adding 'pythran/pythonic/include/operator_/__neg__.hpp' adding 'pythran/pythonic/include/operator_/__not__.hpp' adding 'pythran/pythonic/include/operator_/__or__.hpp' adding 'pythran/pythonic/include/operator_/__pos__.hpp' adding 'pythran/pythonic/include/operator_/__rshift__.hpp' adding 'pythran/pythonic/include/operator_/__sub__.hpp' adding 'pythran/pythonic/include/operator_/__truediv__.hpp' adding 'pythran/pythonic/include/operator_/__xor__.hpp' adding 'pythran/pythonic/include/operator_/abs.hpp' adding 'pythran/pythonic/include/operator_/add.hpp' adding 'pythran/pythonic/include/operator_/and_.hpp' adding 'pythran/pythonic/include/operator_/concat.hpp' adding 'pythran/pythonic/include/operator_/contains.hpp' adding 'pythran/pythonic/include/operator_/countOf.hpp' adding 'pythran/pythonic/include/operator_/delitem.hpp' adding 'pythran/pythonic/include/operator_/div.hpp' adding 'pythran/pythonic/include/operator_/eq.hpp' adding 'pythran/pythonic/include/operator_/floordiv.hpp' adding 'pythran/pythonic/include/operator_/ge.hpp' adding 'pythran/pythonic/include/operator_/getitem.hpp' adding 'pythran/pythonic/include/operator_/gt.hpp' adding 'pythran/pythonic/include/operator_/iadd.hpp' adding 'pythran/pythonic/include/operator_/iand.hpp' adding 'pythran/pythonic/include/operator_/icommon.hpp' adding 'pythran/pythonic/include/operator_/iconcat.hpp' adding 'pythran/pythonic/include/operator_/idiv.hpp' adding 'pythran/pythonic/include/operator_/ifloordiv.hpp' adding 'pythran/pythonic/include/operator_/ilshift.hpp' adding 'pythran/pythonic/include/operator_/imatmul.hpp' adding 'pythran/pythonic/include/operator_/imax.hpp' adding 'pythran/pythonic/include/operator_/imin.hpp' adding 'pythran/pythonic/include/operator_/imod.hpp' adding 'pythran/pythonic/include/operator_/imul.hpp' adding 'pythran/pythonic/include/operator_/indexOf.hpp' adding 'pythran/pythonic/include/operator_/inv.hpp' adding 'pythran/pythonic/include/operator_/invert.hpp' adding 'pythran/pythonic/include/operator_/ior.hpp' adding 'pythran/pythonic/include/operator_/ipow.hpp' adding 'pythran/pythonic/include/operator_/irshift.hpp' adding 'pythran/pythonic/include/operator_/is_.hpp' adding 'pythran/pythonic/include/operator_/is_not.hpp' adding 'pythran/pythonic/include/operator_/isub.hpp' adding 'pythran/pythonic/include/operator_/itemgetter.hpp' adding 'pythran/pythonic/include/operator_/itruediv.hpp' adding 'pythran/pythonic/include/operator_/ixor.hpp' adding 'pythran/pythonic/include/operator_/le.hpp' adding 'pythran/pythonic/include/operator_/lshift.hpp' adding 'pythran/pythonic/include/operator_/lt.hpp' adding 'pythran/pythonic/include/operator_/matmul.hpp' adding 'pythran/pythonic/include/operator_/mod.hpp' adding 'pythran/pythonic/include/operator_/mul.hpp' adding 'pythran/pythonic/include/operator_/ne.hpp' adding 'pythran/pythonic/include/operator_/neg.hpp' adding 'pythran/pythonic/include/operator_/not_.hpp' adding 'pythran/pythonic/include/operator_/or_.hpp' adding 'pythran/pythonic/include/operator_/overloads.hpp' adding 'pythran/pythonic/include/operator_/pos.hpp' adding 'pythran/pythonic/include/operator_/pow.hpp' adding 'pythran/pythonic/include/operator_/rshift.hpp' adding 
'pythran/pythonic/include/operator_/sub.hpp' adding 'pythran/pythonic/include/operator_/truediv.hpp' adding 'pythran/pythonic/include/operator_/truth.hpp' adding 'pythran/pythonic/include/operator_/xor_.hpp' adding 'pythran/pythonic/include/os/path/join.hpp' adding 'pythran/pythonic/include/random/choice.hpp' adding 'pythran/pythonic/include/random/expovariate.hpp' adding 'pythran/pythonic/include/random/gauss.hpp' adding 'pythran/pythonic/include/random/randint.hpp' adding 'pythran/pythonic/include/random/random.hpp' adding 'pythran/pythonic/include/random/randrange.hpp' adding 'pythran/pythonic/include/random/sample.hpp' adding 'pythran/pythonic/include/random/seed.hpp' adding 'pythran/pythonic/include/random/shuffle.hpp' adding 'pythran/pythonic/include/random/uniform.hpp' adding 'pythran/pythonic/include/scipy/special/binom.hpp' adding 'pythran/pythonic/include/scipy/special/gamma.hpp' adding 'pythran/pythonic/include/scipy/special/gammaln.hpp' adding 'pythran/pythonic/include/scipy/special/hankel1.hpp' adding 'pythran/pythonic/include/scipy/special/hankel2.hpp' adding 'pythran/pythonic/include/scipy/special/i0.hpp' adding 'pythran/pythonic/include/scipy/special/i0e.hpp' adding 'pythran/pythonic/include/scipy/special/iv.hpp' adding 'pythran/pythonic/include/scipy/special/ivp.hpp' adding 'pythran/pythonic/include/scipy/special/jv.hpp' adding 'pythran/pythonic/include/scipy/special/jvp.hpp' adding 'pythran/pythonic/include/scipy/special/kv.hpp' adding 'pythran/pythonic/include/scipy/special/kvp.hpp' adding 'pythran/pythonic/include/scipy/special/spherical_jn.hpp' adding 'pythran/pythonic/include/scipy/special/spherical_yn.hpp' adding 'pythran/pythonic/include/scipy/special/yv.hpp' adding 'pythran/pythonic/include/scipy/special/yvp.hpp' adding 'pythran/pythonic/include/string/ascii_letters.hpp' adding 'pythran/pythonic/include/string/ascii_lowercase.hpp' adding 'pythran/pythonic/include/string/ascii_uppercase.hpp' adding 'pythran/pythonic/include/string/digits.hpp' adding 'pythran/pythonic/include/string/find.hpp' adding 'pythran/pythonic/include/string/hexdigits.hpp' adding 'pythran/pythonic/include/string/octdigits.hpp' adding 'pythran/pythonic/include/time/sleep.hpp' adding 'pythran/pythonic/include/time/time.hpp' adding 'pythran/pythonic/include/types/NoneType.hpp' adding 'pythran/pythonic/include/types/assignable.hpp' adding 'pythran/pythonic/include/types/attr.hpp' adding 'pythran/pythonic/include/types/bool.hpp' adding 'pythran/pythonic/include/types/cfun.hpp' adding 'pythran/pythonic/include/types/combined.hpp' adding 'pythran/pythonic/include/types/complex.hpp' adding 'pythran/pythonic/include/types/complex128.hpp' adding 'pythran/pythonic/include/types/complex256.hpp' adding 'pythran/pythonic/include/types/complex64.hpp' adding 'pythran/pythonic/include/types/dict.hpp' adding 'pythran/pythonic/include/types/dynamic_tuple.hpp' adding 'pythran/pythonic/include/types/empty_iterator.hpp' adding 'pythran/pythonic/include/types/exceptions.hpp' adding 'pythran/pythonic/include/types/file.hpp' adding 'pythran/pythonic/include/types/finfo.hpp' adding 'pythran/pythonic/include/types/float.hpp' adding 'pythran/pythonic/include/types/float128.hpp' adding 'pythran/pythonic/include/types/float32.hpp' adding 'pythran/pythonic/include/types/float64.hpp' adding 'pythran/pythonic/include/types/generator.hpp' adding 'pythran/pythonic/include/types/immediate.hpp' adding 'pythran/pythonic/include/types/int.hpp' adding 'pythran/pythonic/include/types/int16.hpp' adding 
'pythran/pythonic/include/types/int32.hpp' adding 'pythran/pythonic/include/types/int64.hpp' adding 'pythran/pythonic/include/types/int8.hpp' adding 'pythran/pythonic/include/types/intc.hpp' adding 'pythran/pythonic/include/types/intp.hpp' adding 'pythran/pythonic/include/types/lazy.hpp' adding 'pythran/pythonic/include/types/list.hpp' adding 'pythran/pythonic/include/types/ndarray.hpp' adding 'pythran/pythonic/include/types/nditerator.hpp' adding 'pythran/pythonic/include/types/numpy_binary_op.hpp' adding 'pythran/pythonic/include/types/numpy_broadcast.hpp' adding 'pythran/pythonic/include/types/numpy_expr.hpp' adding 'pythran/pythonic/include/types/numpy_gexpr.hpp' adding 'pythran/pythonic/include/types/numpy_iexpr.hpp' adding 'pythran/pythonic/include/types/numpy_nary_expr.hpp' adding 'pythran/pythonic/include/types/numpy_op_helper.hpp' adding 'pythran/pythonic/include/types/numpy_operators.hpp' adding 'pythran/pythonic/include/types/numpy_texpr.hpp' adding 'pythran/pythonic/include/types/numpy_unary_op.hpp' adding 'pythran/pythonic/include/types/numpy_vexpr.hpp' adding 'pythran/pythonic/include/types/pointer.hpp' adding 'pythran/pythonic/include/types/raw_array.hpp' adding 'pythran/pythonic/include/types/set.hpp' adding 'pythran/pythonic/include/types/slice.hpp' adding 'pythran/pythonic/include/types/static_if.hpp' adding 'pythran/pythonic/include/types/str.hpp' adding 'pythran/pythonic/include/types/traits.hpp' adding 'pythran/pythonic/include/types/tuple.hpp' adding 'pythran/pythonic/include/types/uint16.hpp' adding 'pythran/pythonic/include/types/uint32.hpp' adding 'pythran/pythonic/include/types/uint64.hpp' adding 'pythran/pythonic/include/types/uint8.hpp' adding 'pythran/pythonic/include/types/uintc.hpp' adding 'pythran/pythonic/include/types/uintp.hpp' adding 'pythran/pythonic/include/types/variant_functor.hpp' adding 'pythran/pythonic/include/types/vectorizable_type.hpp' adding 'pythran/pythonic/include/utils/array_helper.hpp' adding 'pythran/pythonic/include/utils/broadcast_copy.hpp' adding 'pythran/pythonic/include/utils/functor.hpp' adding 'pythran/pythonic/include/utils/fwd.hpp' adding 'pythran/pythonic/include/utils/int_.hpp' adding 'pythran/pythonic/include/utils/iterator.hpp' adding 'pythran/pythonic/include/utils/meta.hpp' adding 'pythran/pythonic/include/utils/nested_container.hpp' adding 'pythran/pythonic/include/utils/neutral.hpp' adding 'pythran/pythonic/include/utils/numpy_conversion.hpp' adding 'pythran/pythonic/include/utils/numpy_traits.hpp' adding 'pythran/pythonic/include/utils/reserve.hpp' adding 'pythran/pythonic/include/utils/seq.hpp' adding 'pythran/pythonic/include/utils/shared_ref.hpp' adding 'pythran/pythonic/include/utils/tags.hpp' adding 'pythran/pythonic/include/utils/yield.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/close.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/fileno.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/flush.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/isatty.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/next.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/read.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/readline.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/readlines.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/seek.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/tell.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/truncate.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/write.hpp' adding 'pythran/pythonic/io/_io/TextIOWrapper/writelines.hpp' adding 
'pythran/pythonic/itertools/combinations.hpp' adding 'pythran/pythonic/itertools/common.hpp' adding 'pythran/pythonic/itertools/count.hpp' adding 'pythran/pythonic/itertools/ifilter.hpp' adding 'pythran/pythonic/itertools/islice.hpp' adding 'pythran/pythonic/itertools/permutations.hpp' adding 'pythran/pythonic/itertools/product.hpp' adding 'pythran/pythonic/itertools/repeat.hpp' adding 'pythran/pythonic/math/acos.hpp' adding 'pythran/pythonic/math/acosh.hpp' adding 'pythran/pythonic/math/asin.hpp' adding 'pythran/pythonic/math/asinh.hpp' adding 'pythran/pythonic/math/atan.hpp' adding 'pythran/pythonic/math/atan2.hpp' adding 'pythran/pythonic/math/atanh.hpp' adding 'pythran/pythonic/math/ceil.hpp' adding 'pythran/pythonic/math/copysign.hpp' adding 'pythran/pythonic/math/cos.hpp' adding 'pythran/pythonic/math/cosh.hpp' adding 'pythran/pythonic/math/degrees.hpp' adding 'pythran/pythonic/math/e.hpp' adding 'pythran/pythonic/math/erf.hpp' adding 'pythran/pythonic/math/erfc.hpp' adding 'pythran/pythonic/math/exp.hpp' adding 'pythran/pythonic/math/expm1.hpp' adding 'pythran/pythonic/math/fabs.hpp' adding 'pythran/pythonic/math/factorial.hpp' adding 'pythran/pythonic/math/floor.hpp' adding 'pythran/pythonic/math/fmod.hpp' adding 'pythran/pythonic/math/frexp.hpp' adding 'pythran/pythonic/math/gamma.hpp' adding 'pythran/pythonic/math/hypot.hpp' adding 'pythran/pythonic/math/isinf.hpp' adding 'pythran/pythonic/math/isnan.hpp' adding 'pythran/pythonic/math/ldexp.hpp' adding 'pythran/pythonic/math/lgamma.hpp' adding 'pythran/pythonic/math/log.hpp' adding 'pythran/pythonic/math/log10.hpp' adding 'pythran/pythonic/math/log1p.hpp' adding 'pythran/pythonic/math/modf.hpp' adding 'pythran/pythonic/math/pi.hpp' adding 'pythran/pythonic/math/pow.hpp' adding 'pythran/pythonic/math/radians.hpp' adding 'pythran/pythonic/math/sin.hpp' adding 'pythran/pythonic/math/sinh.hpp' adding 'pythran/pythonic/math/sqrt.hpp' adding 'pythran/pythonic/math/tan.hpp' adding 'pythran/pythonic/math/tanh.hpp' adding 'pythran/pythonic/math/trunc.hpp' adding 'pythran/pythonic/numpy/NINF.hpp' adding 'pythran/pythonic/numpy/abs.hpp' adding 'pythran/pythonic/numpy/absolute.hpp' adding 'pythran/pythonic/numpy/add.hpp' adding 'pythran/pythonic/numpy/alen.hpp' adding 'pythran/pythonic/numpy/all.hpp' adding 'pythran/pythonic/numpy/allclose.hpp' adding 'pythran/pythonic/numpy/alltrue.hpp' adding 'pythran/pythonic/numpy/amax.hpp' adding 'pythran/pythonic/numpy/amin.hpp' adding 'pythran/pythonic/numpy/angle.hpp' adding 'pythran/pythonic/numpy/angle_in_deg.hpp' adding 'pythran/pythonic/numpy/angle_in_rad.hpp' adding 'pythran/pythonic/numpy/any.hpp' adding 'pythran/pythonic/numpy/append.hpp' adding 'pythran/pythonic/numpy/arange.hpp' adding 'pythran/pythonic/numpy/arccos.hpp' adding 'pythran/pythonic/numpy/arccosh.hpp' adding 'pythran/pythonic/numpy/arcsin.hpp' adding 'pythran/pythonic/numpy/arcsinh.hpp' adding 'pythran/pythonic/numpy/arctan.hpp' adding 'pythran/pythonic/numpy/arctan2.hpp' adding 'pythran/pythonic/numpy/arctanh.hpp' adding 'pythran/pythonic/numpy/argmax.hpp' adding 'pythran/pythonic/numpy/argmin.hpp' adding 'pythran/pythonic/numpy/argminmax.hpp' adding 'pythran/pythonic/numpy/argsort.hpp' adding 'pythran/pythonic/numpy/argwhere.hpp' adding 'pythran/pythonic/numpy/around.hpp' adding 'pythran/pythonic/numpy/array.hpp' adding 'pythran/pythonic/numpy/array2string.hpp' adding 'pythran/pythonic/numpy/array_equal.hpp' adding 'pythran/pythonic/numpy/array_equiv.hpp' adding 'pythran/pythonic/numpy/array_split.hpp' adding 
'pythran/pythonic/numpy/array_str.hpp' adding 'pythran/pythonic/numpy/asarray.hpp' adding 'pythran/pythonic/numpy/asarray_chkfinite.hpp' adding 'pythran/pythonic/numpy/ascontiguousarray.hpp' adding 'pythran/pythonic/numpy/asfarray.hpp' adding 'pythran/pythonic/numpy/asscalar.hpp' adding 'pythran/pythonic/numpy/atleast_1d.hpp' adding 'pythran/pythonic/numpy/atleast_2d.hpp' adding 'pythran/pythonic/numpy/atleast_3d.hpp' adding 'pythran/pythonic/numpy/average.hpp' adding 'pythran/pythonic/numpy/base_repr.hpp' adding 'pythran/pythonic/numpy/binary_repr.hpp' adding 'pythran/pythonic/numpy/bincount.hpp' adding 'pythran/pythonic/numpy/bitwise_and.hpp' adding 'pythran/pythonic/numpy/bitwise_not.hpp' adding 'pythran/pythonic/numpy/bitwise_or.hpp' adding 'pythran/pythonic/numpy/bitwise_xor.hpp' adding 'pythran/pythonic/numpy/bool_.hpp' adding 'pythran/pythonic/numpy/broadcast_to.hpp' adding 'pythran/pythonic/numpy/byte.hpp' adding 'pythran/pythonic/numpy/cbrt.hpp' adding 'pythran/pythonic/numpy/ceil.hpp' adding 'pythran/pythonic/numpy/clip.hpp' adding 'pythran/pythonic/numpy/complex.hpp' adding 'pythran/pythonic/numpy/complex128.hpp' adding 'pythran/pythonic/numpy/complex256.hpp' adding 'pythran/pythonic/numpy/complex64.hpp' adding 'pythran/pythonic/numpy/concatenate.hpp' adding 'pythran/pythonic/numpy/conj.hpp' adding 'pythran/pythonic/numpy/conjugate.hpp' adding 'pythran/pythonic/numpy/convolve.hpp' adding 'pythran/pythonic/numpy/copy.hpp' adding 'pythran/pythonic/numpy/copysign.hpp' adding 'pythran/pythonic/numpy/copyto.hpp' adding 'pythran/pythonic/numpy/correlate.hpp' adding 'pythran/pythonic/numpy/cos.hpp' adding 'pythran/pythonic/numpy/cosh.hpp' adding 'pythran/pythonic/numpy/count_nonzero.hpp' adding 'pythran/pythonic/numpy/cross.hpp' adding 'pythran/pythonic/numpy/cumprod.hpp' adding 'pythran/pythonic/numpy/cumproduct.hpp' adding 'pythran/pythonic/numpy/cumsum.hpp' adding 'pythran/pythonic/numpy/deg2rad.hpp' adding 'pythran/pythonic/numpy/degrees.hpp' adding 'pythran/pythonic/numpy/delete_.hpp' adding 'pythran/pythonic/numpy/diag.hpp' adding 'pythran/pythonic/numpy/diagflat.hpp' adding 'pythran/pythonic/numpy/diagonal.hpp' adding 'pythran/pythonic/numpy/diff.hpp' adding 'pythran/pythonic/numpy/digitize.hpp' adding 'pythran/pythonic/numpy/divide.hpp' adding 'pythran/pythonic/numpy/dot.hpp' adding 'pythran/pythonic/numpy/double_.hpp' adding 'pythran/pythonic/numpy/e.hpp' adding 'pythran/pythonic/numpy/ediff1d.hpp' adding 'pythran/pythonic/numpy/empty.hpp' adding 'pythran/pythonic/numpy/empty_like.hpp' adding 'pythran/pythonic/numpy/equal.hpp' adding 'pythran/pythonic/numpy/exp.hpp' adding 'pythran/pythonic/numpy/expand_dims.hpp' adding 'pythran/pythonic/numpy/expm1.hpp' adding 'pythran/pythonic/numpy/eye.hpp' adding 'pythran/pythonic/numpy/fabs.hpp' adding 'pythran/pythonic/numpy/fill_diagonal.hpp' adding 'pythran/pythonic/numpy/finfo.hpp' adding 'pythran/pythonic/numpy/fix.hpp' adding 'pythran/pythonic/numpy/flatnonzero.hpp' adding 'pythran/pythonic/numpy/flip.hpp' adding 'pythran/pythonic/numpy/fliplr.hpp' adding 'pythran/pythonic/numpy/flipud.hpp' adding 'pythran/pythonic/numpy/float128.hpp' adding 'pythran/pythonic/numpy/float32.hpp' adding 'pythran/pythonic/numpy/float64.hpp' adding 'pythran/pythonic/numpy/float_.hpp' adding 'pythran/pythonic/numpy/floor.hpp' adding 'pythran/pythonic/numpy/floor_divide.hpp' adding 'pythran/pythonic/numpy/fmax.hpp' adding 'pythran/pythonic/numpy/fmin.hpp' adding 'pythran/pythonic/numpy/fmod.hpp' adding 'pythran/pythonic/numpy/frexp.hpp' adding 
'pythran/pythonic/numpy/fromfile.hpp' adding 'pythran/pythonic/numpy/fromfunction.hpp' adding 'pythran/pythonic/numpy/fromiter.hpp' adding 'pythran/pythonic/numpy/fromstring.hpp' adding 'pythran/pythonic/numpy/full.hpp' adding 'pythran/pythonic/numpy/full_like.hpp' adding 'pythran/pythonic/numpy/greater.hpp' adding 'pythran/pythonic/numpy/greater_equal.hpp' adding 'pythran/pythonic/numpy/heaviside.hpp' adding 'pythran/pythonic/numpy/hstack.hpp' adding 'pythran/pythonic/numpy/hypot.hpp' adding 'pythran/pythonic/numpy/identity.hpp' adding 'pythran/pythonic/numpy/imag.hpp' adding 'pythran/pythonic/numpy/indices.hpp' adding 'pythran/pythonic/numpy/inf.hpp' adding 'pythran/pythonic/numpy/inner.hpp' adding 'pythran/pythonic/numpy/insert.hpp' adding 'pythran/pythonic/numpy/int16.hpp' adding 'pythran/pythonic/numpy/int32.hpp' adding 'pythran/pythonic/numpy/int64.hpp' adding 'pythran/pythonic/numpy/int8.hpp' adding 'pythran/pythonic/numpy/int_.hpp' adding 'pythran/pythonic/numpy/intc.hpp' adding 'pythran/pythonic/numpy/interp.hpp' adding 'pythran/pythonic/numpy/interp_core.hpp' adding 'pythran/pythonic/numpy/intersect1d.hpp' adding 'pythran/pythonic/numpy/intp.hpp' adding 'pythran/pythonic/numpy/invert.hpp' adding 'pythran/pythonic/numpy/isclose.hpp' adding 'pythran/pythonic/numpy/iscomplex.hpp' adding 'pythran/pythonic/numpy/isfinite.hpp' adding 'pythran/pythonic/numpy/isinf.hpp' adding 'pythran/pythonic/numpy/isnan.hpp' adding 'pythran/pythonic/numpy/isneginf.hpp' adding 'pythran/pythonic/numpy/isposinf.hpp' adding 'pythran/pythonic/numpy/isreal.hpp' adding 'pythran/pythonic/numpy/isrealobj.hpp' adding 'pythran/pythonic/numpy/isscalar.hpp' adding 'pythran/pythonic/numpy/issctype.hpp' adding 'pythran/pythonic/numpy/ldexp.hpp' adding 'pythran/pythonic/numpy/left_shift.hpp' adding 'pythran/pythonic/numpy/less.hpp' adding 'pythran/pythonic/numpy/less_equal.hpp' adding 'pythran/pythonic/numpy/lexsort.hpp' adding 'pythran/pythonic/numpy/linspace.hpp' adding 'pythran/pythonic/numpy/log.hpp' adding 'pythran/pythonic/numpy/log10.hpp' adding 'pythran/pythonic/numpy/log1p.hpp' adding 'pythran/pythonic/numpy/log2.hpp' adding 'pythran/pythonic/numpy/logaddexp.hpp' adding 'pythran/pythonic/numpy/logaddexp2.hpp' adding 'pythran/pythonic/numpy/logical_and.hpp' adding 'pythran/pythonic/numpy/logical_not.hpp' adding 'pythran/pythonic/numpy/logical_or.hpp' adding 'pythran/pythonic/numpy/logical_xor.hpp' adding 'pythran/pythonic/numpy/logspace.hpp' adding 'pythran/pythonic/numpy/longlong.hpp' adding 'pythran/pythonic/numpy/max.hpp' adding 'pythran/pythonic/numpy/maximum.hpp' adding 'pythran/pythonic/numpy/mean.hpp' adding 'pythran/pythonic/numpy/median.hpp' adding 'pythran/pythonic/numpy/min.hpp' adding 'pythran/pythonic/numpy/minimum.hpp' adding 'pythran/pythonic/numpy/mod.hpp' adding 'pythran/pythonic/numpy/multiply.hpp' adding 'pythran/pythonic/numpy/nan.hpp' adding 'pythran/pythonic/numpy/nan_to_num.hpp' adding 'pythran/pythonic/numpy/nanargmax.hpp' adding 'pythran/pythonic/numpy/nanargmin.hpp' adding 'pythran/pythonic/numpy/nanmax.hpp' adding 'pythran/pythonic/numpy/nanmin.hpp' adding 'pythran/pythonic/numpy/nansum.hpp' adding 'pythran/pythonic/numpy/ndarray.hpp' adding 'pythran/pythonic/numpy/ndenumerate.hpp' adding 'pythran/pythonic/numpy/ndim.hpp' adding 'pythran/pythonic/numpy/ndindex.hpp' adding 'pythran/pythonic/numpy/negative.hpp' adding 'pythran/pythonic/numpy/newaxis.hpp' adding 'pythran/pythonic/numpy/nextafter.hpp' adding 'pythran/pythonic/numpy/nonzero.hpp' adding 
'pythran/pythonic/numpy/not_equal.hpp' adding 'pythran/pythonic/numpy/ones.hpp' adding 'pythran/pythonic/numpy/ones_like.hpp' adding 'pythran/pythonic/numpy/outer.hpp' adding 'pythran/pythonic/numpy/partial_sum.hpp' adding 'pythran/pythonic/numpy/pi.hpp' adding 'pythran/pythonic/numpy/place.hpp' adding 'pythran/pythonic/numpy/power.hpp' adding 'pythran/pythonic/numpy/prod.hpp' adding 'pythran/pythonic/numpy/product.hpp' adding 'pythran/pythonic/numpy/ptp.hpp' adding 'pythran/pythonic/numpy/put.hpp' adding 'pythran/pythonic/numpy/putmask.hpp' adding 'pythran/pythonic/numpy/rad2deg.hpp' adding 'pythran/pythonic/numpy/radians.hpp' adding 'pythran/pythonic/numpy/ravel.hpp' adding 'pythran/pythonic/numpy/real.hpp' adding 'pythran/pythonic/numpy/reciprocal.hpp' adding 'pythran/pythonic/numpy/reduce.hpp' adding 'pythran/pythonic/numpy/remainder.hpp' adding 'pythran/pythonic/numpy/repeat.hpp' adding 'pythran/pythonic/numpy/resize.hpp' adding 'pythran/pythonic/numpy/right_shift.hpp' adding 'pythran/pythonic/numpy/rint.hpp' adding 'pythran/pythonic/numpy/roll.hpp' adding 'pythran/pythonic/numpy/rollaxis.hpp' adding 'pythran/pythonic/numpy/rot90.hpp' adding 'pythran/pythonic/numpy/round.hpp' adding 'pythran/pythonic/numpy/round_.hpp' adding 'pythran/pythonic/numpy/searchsorted.hpp' adding 'pythran/pythonic/numpy/select.hpp' adding 'pythran/pythonic/numpy/setdiff1d.hpp' adding 'pythran/pythonic/numpy/shape.hpp' adding 'pythran/pythonic/numpy/short_.hpp' adding 'pythran/pythonic/numpy/sign.hpp' adding 'pythran/pythonic/numpy/signbit.hpp' adding 'pythran/pythonic/numpy/sin.hpp' adding 'pythran/pythonic/numpy/sinh.hpp' adding 'pythran/pythonic/numpy/size.hpp' adding 'pythran/pythonic/numpy/sometrue.hpp' adding 'pythran/pythonic/numpy/sort.hpp' adding 'pythran/pythonic/numpy/sort_complex.hpp' adding 'pythran/pythonic/numpy/spacing.hpp' adding 'pythran/pythonic/numpy/split.hpp' adding 'pythran/pythonic/numpy/sqrt.hpp' adding 'pythran/pythonic/numpy/square.hpp' adding 'pythran/pythonic/numpy/stack.hpp' adding 'pythran/pythonic/numpy/std_.hpp' adding 'pythran/pythonic/numpy/subtract.hpp' adding 'pythran/pythonic/numpy/sum.hpp' adding 'pythran/pythonic/numpy/swapaxes.hpp' adding 'pythran/pythonic/numpy/take.hpp' adding 'pythran/pythonic/numpy/tan.hpp' adding 'pythran/pythonic/numpy/tanh.hpp' adding 'pythran/pythonic/numpy/tile.hpp' adding 'pythran/pythonic/numpy/trace.hpp' adding 'pythran/pythonic/numpy/transpose.hpp' adding 'pythran/pythonic/numpy/tri.hpp' adding 'pythran/pythonic/numpy/tril.hpp' adding 'pythran/pythonic/numpy/trim_zeros.hpp' adding 'pythran/pythonic/numpy/triu.hpp' adding 'pythran/pythonic/numpy/true_divide.hpp' adding 'pythran/pythonic/numpy/trunc.hpp' adding 'pythran/pythonic/numpy/ubyte.hpp' adding 'pythran/pythonic/numpy/ufunc_accumulate.hpp' adding 'pythran/pythonic/numpy/ufunc_reduce.hpp' adding 'pythran/pythonic/numpy/uint.hpp' adding 'pythran/pythonic/numpy/uint16.hpp' adding 'pythran/pythonic/numpy/uint32.hpp' adding 'pythran/pythonic/numpy/uint64.hpp' adding 'pythran/pythonic/numpy/uint8.hpp' adding 'pythran/pythonic/numpy/uintc.hpp' adding 'pythran/pythonic/numpy/uintp.hpp' adding 'pythran/pythonic/numpy/ulonglong.hpp' adding 'pythran/pythonic/numpy/union1d.hpp' adding 'pythran/pythonic/numpy/unique.hpp' adding 'pythran/pythonic/numpy/unravel_index.hpp' adding 'pythran/pythonic/numpy/unwrap.hpp' adding 'pythran/pythonic/numpy/ushort.hpp' adding 'pythran/pythonic/numpy/var.hpp' adding 'pythran/pythonic/numpy/vdot.hpp' adding 'pythran/pythonic/numpy/vstack.hpp' adding 
'pythran/pythonic/numpy/where.hpp' adding 'pythran/pythonic/numpy/zeros.hpp' adding 'pythran/pythonic/numpy/zeros_like.hpp' adding 'pythran/pythonic/numpy/add/accumulate.hpp' adding 'pythran/pythonic/numpy/add/reduce.hpp' adding 'pythran/pythonic/numpy/arctan2/accumulate.hpp' adding 'pythran/pythonic/numpy/bitwise_and/accumulate.hpp' adding 'pythran/pythonic/numpy/bitwise_and/reduce.hpp' adding 'pythran/pythonic/numpy/bitwise_or/accumulate.hpp' adding 'pythran/pythonic/numpy/bitwise_or/reduce.hpp' adding 'pythran/pythonic/numpy/bitwise_xor/accumulate.hpp' adding 'pythran/pythonic/numpy/bitwise_xor/reduce.hpp' adding 'pythran/pythonic/numpy/copysign/accumulate.hpp' adding 'pythran/pythonic/numpy/ctypeslib/as_array.hpp' adding 'pythran/pythonic/numpy/divide/accumulate.hpp' adding 'pythran/pythonic/numpy/dtype/type.hpp' adding 'pythran/pythonic/numpy/equal/accumulate.hpp' adding 'pythran/pythonic/numpy/fft/c2c.hpp' adding 'pythran/pythonic/numpy/fft/fft.hpp' adding 'pythran/pythonic/numpy/fft/hfft.hpp' adding 'pythran/pythonic/numpy/fft/ifft.hpp' adding 'pythran/pythonic/numpy/fft/ihfft.hpp' adding 'pythran/pythonic/numpy/fft/irfft.hpp' adding 'pythran/pythonic/numpy/fft/pocketfft.hpp' adding 'pythran/pythonic/numpy/fft/rfft.hpp' adding 'pythran/pythonic/numpy/floor_divide/accumulate.hpp' adding 'pythran/pythonic/numpy/fmax/accumulate.hpp' adding 'pythran/pythonic/numpy/fmax/reduce.hpp' adding 'pythran/pythonic/numpy/fmin/accumulate.hpp' adding 'pythran/pythonic/numpy/fmin/reduce.hpp' adding 'pythran/pythonic/numpy/fmod/accumulate.hpp' adding 'pythran/pythonic/numpy/greater/accumulate.hpp' adding 'pythran/pythonic/numpy/greater_equal/accumulate.hpp' adding 'pythran/pythonic/numpy/heaviside/accumulate.hpp' adding 'pythran/pythonic/numpy/hypot/accumulate.hpp' adding 'pythran/pythonic/numpy/ldexp/accumulate.hpp' adding 'pythran/pythonic/numpy/left_shift/accumulate.hpp' adding 'pythran/pythonic/numpy/less/accumulate.hpp' adding 'pythran/pythonic/numpy/less_equal/accumulate.hpp' adding 'pythran/pythonic/numpy/linalg/matrix_power.hpp' adding 'pythran/pythonic/numpy/linalg/norm.hpp' adding 'pythran/pythonic/numpy/logaddexp/accumulate.hpp' adding 'pythran/pythonic/numpy/logaddexp2/accumulate.hpp' adding 'pythran/pythonic/numpy/logical_and/accumulate.hpp' adding 'pythran/pythonic/numpy/logical_or/accumulate.hpp' adding 'pythran/pythonic/numpy/logical_xor/accumulate.hpp' adding 'pythran/pythonic/numpy/maximum/accumulate.hpp' adding 'pythran/pythonic/numpy/maximum/reduce.hpp' adding 'pythran/pythonic/numpy/minimum/accumulate.hpp' adding 'pythran/pythonic/numpy/minimum/reduce.hpp' adding 'pythran/pythonic/numpy/mod/accumulate.hpp' adding 'pythran/pythonic/numpy/multiply/accumulate.hpp' adding 'pythran/pythonic/numpy/multiply/reduce.hpp' adding 'pythran/pythonic/numpy/ndarray/astype.hpp' adding 'pythran/pythonic/numpy/ndarray/fill.hpp' adding 'pythran/pythonic/numpy/ndarray/flatten.hpp' adding 'pythran/pythonic/numpy/ndarray/item.hpp' adding 'pythran/pythonic/numpy/ndarray/reshape.hpp' adding 'pythran/pythonic/numpy/ndarray/sort.hpp' adding 'pythran/pythonic/numpy/ndarray/tofile.hpp' adding 'pythran/pythonic/numpy/ndarray/tolist.hpp' adding 'pythran/pythonic/numpy/ndarray/tostring.hpp' adding 'pythran/pythonic/numpy/negative/accumulate.hpp' adding 'pythran/pythonic/numpy/nextafter/accumulate.hpp' adding 'pythran/pythonic/numpy/not_equal/accumulate.hpp' adding 'pythran/pythonic/numpy/power/accumulate.hpp' adding 'pythran/pythonic/numpy/random/binomial.hpp' adding 'pythran/pythonic/numpy/random/bytes.hpp' 
adding 'pythran/pythonic/numpy/random/chisquare.hpp' adding 'pythran/pythonic/numpy/random/choice.hpp' adding 'pythran/pythonic/numpy/random/dirichlet.hpp' adding 'pythran/pythonic/numpy/random/exponential.hpp' adding 'pythran/pythonic/numpy/random/f.hpp' adding 'pythran/pythonic/numpy/random/gamma.hpp' adding 'pythran/pythonic/numpy/random/geometric.hpp' adding 'pythran/pythonic/numpy/random/gumbel.hpp' adding 'pythran/pythonic/numpy/random/laplace.hpp' adding 'pythran/pythonic/numpy/random/logistic.hpp' adding 'pythran/pythonic/numpy/random/lognormal.hpp' adding 'pythran/pythonic/numpy/random/logseries.hpp' adding 'pythran/pythonic/numpy/random/negative_binomial.hpp' adding 'pythran/pythonic/numpy/random/normal.hpp' adding 'pythran/pythonic/numpy/random/pareto.hpp' adding 'pythran/pythonic/numpy/random/poisson.hpp' adding 'pythran/pythonic/numpy/random/power.hpp' adding 'pythran/pythonic/numpy/random/rand.hpp' adding 'pythran/pythonic/numpy/random/randint.hpp' adding 'pythran/pythonic/numpy/random/randn.hpp' adding 'pythran/pythonic/numpy/random/random.hpp' adding 'pythran/pythonic/numpy/random/random_integers.hpp' adding 'pythran/pythonic/numpy/random/random_sample.hpp' adding 'pythran/pythonic/numpy/random/ranf.hpp' adding 'pythran/pythonic/numpy/random/rayleigh.hpp' adding 'pythran/pythonic/numpy/random/sample.hpp' adding 'pythran/pythonic/numpy/random/seed.hpp' adding 'pythran/pythonic/numpy/random/shuffle.hpp' adding 'pythran/pythonic/numpy/random/standard_exponential.hpp' adding 'pythran/pythonic/numpy/random/standard_gamma.hpp' adding 'pythran/pythonic/numpy/random/standard_normal.hpp' adding 'pythran/pythonic/numpy/random/uniform.hpp' adding 'pythran/pythonic/numpy/random/weibull.hpp' adding 'pythran/pythonic/numpy/remainder/accumulate.hpp' adding 'pythran/pythonic/numpy/right_shift/accumulate.hpp' adding 'pythran/pythonic/numpy/subtract/accumulate.hpp' adding 'pythran/pythonic/numpy/true_divide/accumulate.hpp' adding 'pythran/pythonic/omp/get_num_threads.hpp' adding 'pythran/pythonic/omp/get_thread_num.hpp' adding 'pythran/pythonic/omp/get_wtick.hpp' adding 'pythran/pythonic/omp/get_wtime.hpp' adding 'pythran/pythonic/omp/in_parallel.hpp' adding 'pythran/pythonic/omp/set_nested.hpp' adding 'pythran/pythonic/operator_/__abs__.hpp' adding 'pythran/pythonic/operator_/__add__.hpp' adding 'pythran/pythonic/operator_/__and__.hpp' adding 'pythran/pythonic/operator_/__concat__.hpp' adding 'pythran/pythonic/operator_/__contains__.hpp' adding 'pythran/pythonic/operator_/__delitem__.hpp' adding 'pythran/pythonic/operator_/__div__.hpp' adding 'pythran/pythonic/operator_/__eq__.hpp' adding 'pythran/pythonic/operator_/__floordiv__.hpp' adding 'pythran/pythonic/operator_/__ge__.hpp' adding 'pythran/pythonic/operator_/__getitem__.hpp' adding 'pythran/pythonic/operator_/__gt__.hpp' adding 'pythran/pythonic/operator_/__iadd__.hpp' adding 'pythran/pythonic/operator_/__iand__.hpp' adding 'pythran/pythonic/operator_/__iconcat__.hpp' adding 'pythran/pythonic/operator_/__idiv__.hpp' adding 'pythran/pythonic/operator_/__ifloordiv__.hpp' adding 'pythran/pythonic/operator_/__ilshift__.hpp' adding 'pythran/pythonic/operator_/__imod__.hpp' adding 'pythran/pythonic/operator_/__imul__.hpp' adding 'pythran/pythonic/operator_/__inv__.hpp' adding 'pythran/pythonic/operator_/__invert__.hpp' adding 'pythran/pythonic/operator_/__ior__.hpp' adding 'pythran/pythonic/operator_/__ipow__.hpp' adding 'pythran/pythonic/operator_/__irshift__.hpp' adding 'pythran/pythonic/operator_/__isub__.hpp' adding 
'pythran/pythonic/operator_/__itruediv__.hpp' adding 'pythran/pythonic/operator_/__ixor__.hpp' adding 'pythran/pythonic/operator_/__le__.hpp' adding 'pythran/pythonic/operator_/__lshift__.hpp' adding 'pythran/pythonic/operator_/__lt__.hpp' adding 'pythran/pythonic/operator_/__matmul__.hpp' adding 'pythran/pythonic/operator_/__mod__.hpp' adding 'pythran/pythonic/operator_/__mul__.hpp' adding 'pythran/pythonic/operator_/__ne__.hpp' adding 'pythran/pythonic/operator_/__neg__.hpp' adding 'pythran/pythonic/operator_/__not__.hpp' adding 'pythran/pythonic/operator_/__or__.hpp' adding 'pythran/pythonic/operator_/__pos__.hpp' adding 'pythran/pythonic/operator_/__rshift__.hpp' adding 'pythran/pythonic/operator_/__sub__.hpp' adding 'pythran/pythonic/operator_/__truediv__.hpp' adding 'pythran/pythonic/operator_/__xor__.hpp' adding 'pythran/pythonic/operator_/abs.hpp' adding 'pythran/pythonic/operator_/add.hpp' adding 'pythran/pythonic/operator_/and_.hpp' adding 'pythran/pythonic/operator_/concat.hpp' adding 'pythran/pythonic/operator_/contains.hpp' adding 'pythran/pythonic/operator_/countOf.hpp' adding 'pythran/pythonic/operator_/delitem.hpp' adding 'pythran/pythonic/operator_/div.hpp' adding 'pythran/pythonic/operator_/eq.hpp' adding 'pythran/pythonic/operator_/floordiv.hpp' adding 'pythran/pythonic/operator_/ge.hpp' adding 'pythran/pythonic/operator_/getitem.hpp' adding 'pythran/pythonic/operator_/gt.hpp' adding 'pythran/pythonic/operator_/iadd.hpp' adding 'pythran/pythonic/operator_/iand.hpp' adding 'pythran/pythonic/operator_/icommon.hpp' adding 'pythran/pythonic/operator_/iconcat.hpp' adding 'pythran/pythonic/operator_/idiv.hpp' adding 'pythran/pythonic/operator_/ifloordiv.hpp' adding 'pythran/pythonic/operator_/ilshift.hpp' adding 'pythran/pythonic/operator_/imatmul.hpp' adding 'pythran/pythonic/operator_/imax.hpp' adding 'pythran/pythonic/operator_/imin.hpp' adding 'pythran/pythonic/operator_/imod.hpp' adding 'pythran/pythonic/operator_/imul.hpp' adding 'pythran/pythonic/operator_/indexOf.hpp' adding 'pythran/pythonic/operator_/inv.hpp' adding 'pythran/pythonic/operator_/invert.hpp' adding 'pythran/pythonic/operator_/ior.hpp' adding 'pythran/pythonic/operator_/ipow.hpp' adding 'pythran/pythonic/operator_/irshift.hpp' adding 'pythran/pythonic/operator_/is_.hpp' adding 'pythran/pythonic/operator_/is_not.hpp' adding 'pythran/pythonic/operator_/isub.hpp' adding 'pythran/pythonic/operator_/itemgetter.hpp' adding 'pythran/pythonic/operator_/itruediv.hpp' adding 'pythran/pythonic/operator_/ixor.hpp' adding 'pythran/pythonic/operator_/le.hpp' adding 'pythran/pythonic/operator_/lshift.hpp' adding 'pythran/pythonic/operator_/lt.hpp' adding 'pythran/pythonic/operator_/matmul.hpp' adding 'pythran/pythonic/operator_/mod.hpp' adding 'pythran/pythonic/operator_/mul.hpp' adding 'pythran/pythonic/operator_/ne.hpp' adding 'pythran/pythonic/operator_/neg.hpp' adding 'pythran/pythonic/operator_/not_.hpp' adding 'pythran/pythonic/operator_/or_.hpp' adding 'pythran/pythonic/operator_/overloads.hpp' adding 'pythran/pythonic/operator_/pos.hpp' adding 'pythran/pythonic/operator_/pow.hpp' adding 'pythran/pythonic/operator_/rshift.hpp' adding 'pythran/pythonic/operator_/sub.hpp' adding 'pythran/pythonic/operator_/truediv.hpp' adding 'pythran/pythonic/operator_/truth.hpp' adding 'pythran/pythonic/operator_/xor_.hpp' adding 'pythran/pythonic/os/path/join.hpp' adding 'pythran/pythonic/patch/README.rst' adding 'pythran/pythonic/patch/complex' adding 'pythran/pythonic/python/core.hpp' adding 
'pythran/pythonic/python/exception_handler.hpp' adding 'pythran/pythonic/random/choice.hpp' adding 'pythran/pythonic/random/expovariate.hpp' adding 'pythran/pythonic/random/gauss.hpp' adding 'pythran/pythonic/random/randint.hpp' adding 'pythran/pythonic/random/random.hpp' adding 'pythran/pythonic/random/randrange.hpp' adding 'pythran/pythonic/random/sample.hpp' adding 'pythran/pythonic/random/seed.hpp' adding 'pythran/pythonic/random/shuffle.hpp' adding 'pythran/pythonic/random/uniform.hpp' adding 'pythran/pythonic/scipy/special/binom.hpp' adding 'pythran/pythonic/scipy/special/chbevl.hpp' adding 'pythran/pythonic/scipy/special/gamma.hpp' adding 'pythran/pythonic/scipy/special/gammaln.hpp' adding 'pythran/pythonic/scipy/special/hankel1.hpp' adding 'pythran/pythonic/scipy/special/hankel2.hpp' adding 'pythran/pythonic/scipy/special/i0.hpp' adding 'pythran/pythonic/scipy/special/i0e.hpp' adding 'pythran/pythonic/scipy/special/iv.hpp' adding 'pythran/pythonic/scipy/special/ivp.hpp' adding 'pythran/pythonic/scipy/special/jv.hpp' adding 'pythran/pythonic/scipy/special/jvp.hpp' adding 'pythran/pythonic/scipy/special/kv.hpp' adding 'pythran/pythonic/scipy/special/kvp.hpp' adding 'pythran/pythonic/scipy/special/spherical_jn.hpp' adding 'pythran/pythonic/scipy/special/spherical_yn.hpp' adding 'pythran/pythonic/scipy/special/yv.hpp' adding 'pythran/pythonic/scipy/special/yvp.hpp' adding 'pythran/pythonic/string/ascii_letters.hpp' adding 'pythran/pythonic/string/ascii_lowercase.hpp' adding 'pythran/pythonic/string/ascii_uppercase.hpp' adding 'pythran/pythonic/string/digits.hpp' adding 'pythran/pythonic/string/find.hpp' adding 'pythran/pythonic/string/hexdigits.hpp' adding 'pythran/pythonic/string/octdigits.hpp' adding 'pythran/pythonic/time/sleep.hpp' adding 'pythran/pythonic/time/time.hpp' adding 'pythran/pythonic/types/NoneType.hpp' adding 'pythran/pythonic/types/assignable.hpp' adding 'pythran/pythonic/types/attr.hpp' adding 'pythran/pythonic/types/bool.hpp' adding 'pythran/pythonic/types/cfun.hpp' adding 'pythran/pythonic/types/combined.hpp' adding 'pythran/pythonic/types/complex.hpp' adding 'pythran/pythonic/types/complex128.hpp' adding 'pythran/pythonic/types/complex256.hpp' adding 'pythran/pythonic/types/complex64.hpp' adding 'pythran/pythonic/types/dict.hpp' adding 'pythran/pythonic/types/dynamic_tuple.hpp' adding 'pythran/pythonic/types/empty_iterator.hpp' adding 'pythran/pythonic/types/exceptions.hpp' adding 'pythran/pythonic/types/file.hpp' adding 'pythran/pythonic/types/finfo.hpp' adding 'pythran/pythonic/types/float.hpp' adding 'pythran/pythonic/types/float128.hpp' adding 'pythran/pythonic/types/float32.hpp' adding 'pythran/pythonic/types/float64.hpp' adding 'pythran/pythonic/types/generator.hpp' adding 'pythran/pythonic/types/int.hpp' adding 'pythran/pythonic/types/int16.hpp' adding 'pythran/pythonic/types/int32.hpp' adding 'pythran/pythonic/types/int64.hpp' adding 'pythran/pythonic/types/int8.hpp' adding 'pythran/pythonic/types/intc.hpp' adding 'pythran/pythonic/types/intp.hpp' adding 'pythran/pythonic/types/list.hpp' adding 'pythran/pythonic/types/ndarray.hpp' adding 'pythran/pythonic/types/nditerator.hpp' adding 'pythran/pythonic/types/numpy_binary_op.hpp' adding 'pythran/pythonic/types/numpy_broadcast.hpp' adding 'pythran/pythonic/types/numpy_expr.hpp' adding 'pythran/pythonic/types/numpy_gexpr.hpp' adding 'pythran/pythonic/types/numpy_iexpr.hpp' adding 'pythran/pythonic/types/numpy_nary_expr.hpp' adding 'pythran/pythonic/types/numpy_op_helper.hpp' adding 
'pythran/pythonic/types/numpy_operators.hpp' adding 'pythran/pythonic/types/numpy_texpr.hpp' adding 'pythran/pythonic/types/numpy_unary_op.hpp' adding 'pythran/pythonic/types/numpy_vexpr.hpp' adding 'pythran/pythonic/types/pointer.hpp' adding 'pythran/pythonic/types/raw_array.hpp' adding 'pythran/pythonic/types/set.hpp' adding 'pythran/pythonic/types/slice.hpp' adding 'pythran/pythonic/types/static_if.hpp' adding 'pythran/pythonic/types/str.hpp' adding 'pythran/pythonic/types/traits.hpp' adding 'pythran/pythonic/types/tuple.hpp' adding 'pythran/pythonic/types/uint16.hpp' adding 'pythran/pythonic/types/uint32.hpp' adding 'pythran/pythonic/types/uint64.hpp' adding 'pythran/pythonic/types/uint8.hpp' adding 'pythran/pythonic/types/uintc.hpp' adding 'pythran/pythonic/types/uintp.hpp' adding 'pythran/pythonic/types/variant_functor.hpp' adding 'pythran/pythonic/types/vectorizable_type.hpp' adding 'pythran/pythonic/utils/array_helper.hpp' adding 'pythran/pythonic/utils/broadcast_copy.hpp' adding 'pythran/pythonic/utils/functor.hpp' adding 'pythran/pythonic/utils/fwd.hpp' adding 'pythran/pythonic/utils/int_.hpp' adding 'pythran/pythonic/utils/iterator.hpp' adding 'pythran/pythonic/utils/meta.hpp' adding 'pythran/pythonic/utils/nested_container.hpp' adding 'pythran/pythonic/utils/neutral.hpp' adding 'pythran/pythonic/utils/numpy_conversion.hpp' adding 'pythran/pythonic/utils/numpy_traits.hpp' adding 'pythran/pythonic/utils/pdqsort.hpp' adding 'pythran/pythonic/utils/reserve.hpp' adding 'pythran/pythonic/utils/seq.hpp' adding 'pythran/pythonic/utils/shared_ref.hpp' adding 'pythran/pythonic/utils/tags.hpp' adding 'pythran/pythonic/utils/yield.hpp' adding 'pythran/transformations/__init__.py' adding 'pythran/transformations/expand_builtins.py' adding 'pythran/transformations/expand_globals.py' adding 'pythran/transformations/expand_import_all.py' adding 'pythran/transformations/expand_imports.py' adding 'pythran/transformations/extract_doc_strings.py' adding 'pythran/transformations/false_polymorphism.py' adding 'pythran/transformations/handle_import.py' adding 'pythran/transformations/normalize_compare.py' adding 'pythran/transformations/normalize_exception.py' adding 'pythran/transformations/normalize_ifelse.py' adding 'pythran/transformations/normalize_is_none.py' adding 'pythran/transformations/normalize_method_calls.py' adding 'pythran/transformations/normalize_return.py' adding 'pythran/transformations/normalize_static_if.py' adding 'pythran/transformations/normalize_tuples.py' adding 'pythran/transformations/remove_comprehension.py' adding 'pythran/transformations/remove_fstrings.py' adding 'pythran/transformations/remove_lambdas.py' adding 'pythran/transformations/remove_named_arguments.py' adding 'pythran/transformations/remove_nested_functions.py' adding 'pythran/transformations/unshadow_parameters.py' adding 'pythran/types/__init__.py' adding 'pythran/types/conversion.py' adding 'pythran/types/reorder.py' adding 'pythran/types/signature.py' adding 'pythran/types/tog.py' adding 'pythran/types/type_dependencies.py' adding 'pythran/types/types.py' adding 'pythran-0.11.0.dist-info/AUTHORS' adding 'pythran-0.11.0.dist-info/LICENSE' adding 'pythran-0.11.0.dist-info/METADATA' adding 'pythran-0.11.0.dist-info/WHEEL' adding 'pythran-0.11.0.dist-info/entry_points.txt' adding 'pythran-0.11.0.dist-info/top_level.txt' adding 'pythran-0.11.0.dist-info/RECORD' removing build/bdist.linux-ppc64le/wheel Building wheel for pythran (pyproject.toml): finished with status 'done' Created wheel for pythran: 
filename=pythran-0.11.0-py3-none-any.whl size=1194780 sha256=e68426bcef2c970da93fca79c5671adf44e65f8f6be358d3db14a697c4e3a4d5 Stored in directory: /builddir/.cache/pip/wheels/d6/37/5b/ec296f5627e482cce9d84c80db021f49e96d01eee73bae1926 Successfully built pythran + PYTHONPATH=/builddir/build/BUILD/pythran-feature-0.11.0 + make -C docs html make: Entering directory '/builddir/build/BUILD/pythran-feature-0.11.0/docs' sphinx-build -b html -d _build/doctrees . _build/html Running Sphinx v4.3.1 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 14 source files that are out of date updating environment: [new config] 14 added, 0 changed, 0 removed reading sources... [ 7%] AUTHORS reading sources... [ 14%] CLI reading sources... [ 21%] Changelog reading sources... [ 28%] DEVGUIDE reading sources... [ 35%] EXAMPLES reading sources... [ 42%] INTERNAL reading sources... [ 50%] LICENSE reading sources... [ 57%] MANUAL reading sources... [ 64%] SUPPORT reading sources... [ 71%] TUTORIAL reading sources... [ 78%] examples/Distutils Sample Project reading sources... [ 85%] examples/Third Party Libraries reading sources... [ 92%] index reading sources... [100%] papers/sc2013/Pythran with OpenMP looking for now-outdated files... /builddir/build/BUILD/pythran-feature-0.11.0/docs/Changelog.rst:15: WARNING: Inline emphasis start-string without end-string. none found pickling environment... done checking consistency... /builddir/build/BUILD/pythran-feature-0.11.0/docs/papers/sc2013/Pythran with OpenMP.ipynb: WARNING: document isn't included in any toctree done preparing documents... WARNING: unsupported theme option 'project_nav_name' given done writing output... [ 7%] AUTHORS writing output... [ 14%] CLI writing output... [ 21%] Changelog writing output... [ 28%] DEVGUIDE writing output... [ 35%] EXAMPLES writing output... [ 42%] INTERNAL writing output... [ 50%] LICENSE writing output... [ 57%] MANUAL writing output... [ 64%] SUPPORT writing output... [ 71%] TUTORIAL writing output... [ 78%] examples/Distutils Sample Project writing output... [ 85%] examples/Third Party Libraries writing output... [ 92%] index writing output... [100%] papers/sc2013/Pythran with OpenMP generating indices... done copying notebooks ... [ 33%] examples/Distutils Sample Project.ipynb copying notebooks ... [ 66%] examples/Third Party Libraries.ipynb copying notebooks ... [100%] papers/sc2013/Pythran with OpenMP.ipynb writing additional pages... search done copying static files... done copying extra files... done dumping search index in English (code: en)... done dumping object inventory... done build succeeded, 3 warnings. The HTML pages are in _build/html. 
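
The docs step above runs sphinx-build through the project Makefile with the in-tree package on PYTHONPATH. For reference, the same build can be driven from Python; this is a minimal sketch, assuming Sphinx's standard sphinx.cmd.build.build_main entry point (the log shows Sphinx 4.3.1) and that it is executed from the docs/ directory:

    # Sketch only: programmatic equivalent of the
    # `sphinx-build -b html -d _build/doctrees . _build/html` call above.
    from sphinx.cmd.build import build_main

    status = build_main(["-b", "html", "-d", "_build/doctrees", ".", "_build/html"])
    raise SystemExit(status)  # non-zero exit status when the build fails

build_main returns the sphinx-build exit status, so warnings like the three reported above do not fail the build unless -W is added to the argument list.
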
Build finished.\nfile:///builddir/build/BUILD/pythran-feature-0.11.0/docs/_build/html/index.html make: Leaving directory '/builddir/build/BUILD/pythran-feature-0.11.0/docs' + rm -rf docs/_build/html/.doctrees docs/_build/html/.buildinfo + RPM_EC=0 ++ jobs -p + exit 0 Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.NLHvwT + umask 022 + cd /builddir/build/BUILD + '[' /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le '!=' / ']' + rm -rf /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le ++ dirname /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le + mkdir -p /builddir/build/BUILDROOT + mkdir /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le + cd pythran-feature-0.11.0 ++ ls /builddir/build/BUILD/pythran-feature-0.11.0/pyproject-wheeldir/pythran-0.11.0-py3-none-any.whl ++ xargs basename --multiple ++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/' + specifier=pythran==0.11.0 + TMPDIR=/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir + /usr/bin/python3 -m pip install --root /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/pythran-feature-0.11.0/pyproject-wheeldir pythran==0.11.0 Using pip 21.3.1 from /usr/lib/python3.10/site-packages/pip (python 3.10) Looking in links: /builddir/build/BUILD/pythran-feature-0.11.0/pyproject-wheeldir Processing ./pyproject-wheeldir/pythran-0.11.0-py3-none-any.whl Installing collected packages: pythran Creating /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin changing mode of /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran to 755 changing mode of /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran-config to 755 Successfully installed pythran-0.11.0 + '[' -d /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin ']' + '[' -f /usr/bin/pathfix3.10.py ']' + pathfix=/usr/bin/pathfix3.10.py + '[' -z s ']' + shebang_flags=-kas + /usr/bin/pathfix3.10.py -pni /usr/bin/python3 -kas /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran-config /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran: updating /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/pythran-config: updating + rm -rfv /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin/__pycache__ + rm -f /builddir/build/BUILD/pyproject-ghost-distinfo + site_dirs=() + '[' -d /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages ']' + site_dirs+=("/usr/lib/python3.10/site-packages") + '[' /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib64/python3.10/site-packages '!=' /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages ']' + '[' -d /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib64/python3.10/site-packages ']' + for site_dir in ${site_dirs[@]} + for distinfo in /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le$site_dir/*.dist-info + echo '%ghost /usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info' + sed -i s/pip/rpm/ /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/INSTALLER + PYTHONPATH=/usr/lib/rpm/redhat + /usr/bin/python3 -B /usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot 
/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le --record /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/RECORD --output /builddir/build/BUILD/pyproject-record + rm -fv /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/RECORD removed '/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/RECORD' + rm -fv /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/REQUESTED removed '/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages/pythran-0.11.0.dist-info/REQUESTED' ++ wc -l /builddir/build/BUILD/pyproject-ghost-distinfo ++ cut -f1 '-d ' + lines=1 + '[' 1 -ne 1 ']' + /usr/bin/python3 /usr/lib/rpm/redhat/pyproject_save_files.py --output-files /builddir/build/BUILD/pyproject-files --output-modules /builddir/build/BUILD/pyproject-modules --buildroot /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le --sitelib /usr/lib/python3.10/site-packages --sitearch /usr/lib64/python3.10/site-packages --python-version 3.10 --pyproject-record /builddir/build/BUILD/pyproject-record pythran omp + /usr/lib/rpm/check-buildroot + /usr/lib/rpm/redhat/brp-ldconfig + /usr/lib/rpm/brp-compress + /usr/lib/rpm/brp-strip /usr/bin/strip + /usr/lib/rpm/brp-strip-comment-note /usr/bin/strip /usr/bin/objdump + /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip + /usr/lib/rpm/brp-strip-static-archive /usr/bin/strip + /usr/lib/rpm/check-rpaths + /usr/lib/rpm/redhat/brp-mangle-shebangs + /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0 Bytecompiling .py files below /builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10 using python3.10 + /usr/lib/rpm/redhat/brp-python-hardlink Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.SWL4HE + umask 022 + cd /builddir/build/BUILD + cd pythran-feature-0.11.0 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 ' + PATH=/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin + PYTHONPATH=/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib64/python3.10/site-packages:/builddir/build/BUILDROOT/pythran-0.11.0-0.fc36.ppc64le/usr/lib/python3.10/site-packages + PYTHONDONTWRITEBYTECODE=1 + PYTEST_ADDOPTS=' --ignore=/builddir/build/BUILD/pythran-feature-0.11.0/.pyproject-builddir' + /usr/bin/pytest -n auto -k 'not test_numpy_negative_binomial' ============================= test session starts ============================== platform linux -- Python 3.10.1, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 rootdir: /builddir/build/BUILD/pythran-feature-0.11.0 plugins: forked-1.3.0, xdist-2.4.0 gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I gw0 [3479] / gw1 [3479] / gw2 [3479] / gw3 [3479] / gw4 [3479] / gw5 [3479] / gw6 [3479] / gw7 [3479] ......................................s................................. 
[ 2%] ....................................F...............s................... [ 4%] ........................................................................ [ 6%] .................................................................F...... [ 8%] ..........F...F......................s....................F............. [ 10%] F.....s....................s.......................................F.... [ 12%] ..............s...............s.............s........................... [ 14%] ..........................F..........s..........s....................... [ 16%] ...........................s............................................ [ 18%] ...................s........s.....s............................s.....s.. [ 20%] ......................s................................s................ [ 22%] ..........................s............................................. [ 24%] ..........................................F..F.....F..F....F....F....F.. [ 26%] ........................................................................ [ 28%] ........................................................................ [ 31%] ..............................s......................................... [ 33%] ...s.................................................................... [ 35%] ........................................................................ [ 37%] ........................................................................ [ 39%] ......................................F................................F [ 41%] .....F......F.....F....sss.............................................. [ 43%] ..................ss.....F.....F...F.F.......F...F.....F....F.....F..... [ 45%] F..F......F....F...F...................................s...........F.... [ 47%] ..F..F....F..F....F...F....F..F...F..F.........................F........ [ 49%] ......................................................F.....F...F.....F. [ 51%] ..F.....F.....F....ssF...F.....F.....F.....F....F.........sss.F......F.. [ 53%] .F.......F..............F..s....F.....F.....F........F.....F..FF....F.F. [ 55%] ...FF..F...FF...FF..F.F..F.F.F...FF.F..F....F...F.....FF...FF...FF.F...F [ 57%] F...FF..F.F..FF....F.F.F.F.FF....F.F..F...FF...FF.F..F.FF.....F....FF... [ 60%] .F.....F.F......F.......F.......F..........F.......F......F.......F..... [ 62%] .......F........F......F................................................ [ 64%] ........................F......F.ss..................................... [ 66%] ........................................................................ [ 68%] ......................................................F.....FF....F..F.. [ 70%] ..FF..F.F..FF....F...................................................... [ 72%] ........................................................................ [ 74%] ........................................................................ [ 76%] ........................................................................ [ 78%] ........................................................................ [ 80%] ........................................................................ [ 82%] ......................s.................s............................... [ 84%] .....................................................s.......s.......... [ 86%] ................s...........................................F........... [ 88%] ..........................................s.........................F... [ 91%] ....s...s....s.......................................................... 
[ 93%] .....................................ss.........ss...................... [ 95%] ........................................................................ [ 97%] ........................s............................................... [ 99%] ....................... [100%] =================================== FAILURES =================================== __________________________ TestBase.test_complex_conj __________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_complex_conjugate', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp8z3fscwo.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp8z3fscwo.cpp'], output_dir = '/tmp/tmpzy3yx1sc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o', ('/tmp/tmp8z3fscwo.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
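
The numpy.distutils frames above show how CCompiler_compile parallelises the build: a ThreadPool fans out one task per source file while a module-level semaphore caps how many compiler invocations run at once (the semaphore is shared across extensions that may themselves be built in parallel, which is why it exists separately from the pool size). A stripped-down sketch of that pattern, with illustrative names only (compile_all, compile_one and gate are not numpy identifiers):

    # Illustrative sketch of the ThreadPool + Semaphore pattern used by
    # numpy.distutils' CCompiler_compile above; compile_one stands in for
    # the real self._compile call.
    import threading
    from multiprocessing.pool import ThreadPool

    def compile_all(sources, compile_one, jobs=4):
        gate = threading.Semaphore(jobs)      # cap concurrent compiler processes

        def worker(src):
            with gate:                        # take a job slot, then build
                compile_one(src)

        with ThreadPool(jobs) as pool:        # one task per source file
            pool.map(worker, sources)
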
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o', '/tmp/tmp8z3fscwo.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o', src = '/tmp/tmp8z3fscwo.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8z3fscwo.cpp -o /tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_complex_conjugate', cxxfile = '/tmp/tmp8z3fscwo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpf5ulpgy3', buildtmp = '/tmp/tmpzy3yx1sc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_complex_conjugate', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8z3fscwo.cpp -o /tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_complex_conj(self): > self.run_test("def complex_conjugate(c): return c.conjugate()", complex(0,1), complex_conjugate=[complex]) pythran/tests/test_base.py:540: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_complex_conjugate', cxxfile = '/tmp/tmp8z3fscwo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpf5ulpgy3', buildtmp = '/tmp/tmpzy3yx1sc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8z3fscwo.cpp -o /tmp/tmpzy3yx1sc/tmp/tmp8z3fscwo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_complex_conjugate' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 
-fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpzy3yx1sc/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp8z3fscwo.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/__dispatch__/conjugate.hpp:4, from /tmp/tmp8z3fscwo.cpp:10: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp8z3fscwo.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________________ TestCases.test_cdotc_run0 ___________________________ [gw3] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'cdotc_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. 
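
The captured stderr above is the actual root cause: pythonic/include/numpy/conjugate.hpp:25 calls xsimd::conj, and the xsimd headers in use do not declare conj, so every conjugate-related extension fails to compile and the dependent tests fail the same way. The failure can be reproduced outside the test harness; a minimal sketch, assuming pythran's top-level compile_pythrancode export (the function named in the traceback) and a hand-written #pythran export line, whereas the test harness derives its spec from the complex_conjugate=[complex] argument:

    # Minimal reproducer sketch for the compile failure above.
    from pythran import compile_pythrancode

    code = (
        "#pythran export complex_conjugate(complex)\n"
        "def complex_conjugate(c):\n"
        "    return c.conjugate()\n"
    )

    # Expected to raise distutils.errors.CompileError while xsimd::conj
    # is unavailable, mirroring TestBase.test_complex_conj in the log.
    compile_pythrancode("test_complex_conjugate", code)
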
All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. 
Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
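(Illustrative sketch, not part of this log: the flag surgery that PythranBuildExtMixIn.build_extension performs next boils down to removing unwanted entries from compiler_so/linker_so; the ignoreflags list below is invented for the example, whereas pythran reads it from its own configuration.)

    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    compiler = new_compiler()
    customize_compiler(compiler)

    ignoreflags = ['-Wstrict-prototypes', '-DNDEBUG']     # hypothetical ignore list
    for flag in ignoreflags:
        for target in ('compiler_so', 'linker_so'):
            try:
                while True:                               # a flag can occur several times
                    getattr(compiler, target).remove(flag)
            except (AttributeError, ValueError):
                pass                                      # attribute missing or flag absent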
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpkiwu5btd.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpkiwu5btd.cpp'], output_dir = '/tmp/tmpo2hujcvu' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
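(Illustrative sketch, not part of this log: numpy.distutils' CCompiler_compile, quoted next, bounds parallel compilation with a semaphore and a thread pool; this stripped-down version replaces the real _compile call with a sleep so the throttling pattern can be run on its own.)

    import threading
    import time
    import multiprocessing.pool

    jobs = 4                                   # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(src):
        with job_semaphore:                    # at most `jobs` compiles run at once
            time.sleep(0.1)                    # placeholder for self._compile(...)
            print("compiled", src)

    pool = multiprocessing.pool.ThreadPool(jobs)
    pool.map(single_compile, ['a.cpp', 'b.cpp', 'c.cpp'])
    pool.close()
    pool.join()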
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o', ('/tmp/tmpkiwu5btd.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o', '/tmp/tmpkiwu5btd.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o', src = '/tmp/tmpkiwu5btd.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
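(Illustrative sketch, not part of this log: the per-file step in UnixCCompiler__compile, quoted next, concatenates compiler_so, the preprocessor/include arguments, the source/object pair, optional gcc dependency flags and the extra post-arguments into one command; the values below are made up and the command is printed instead of being spawned.)

    import os

    compiler_so = ['gcc', '-O2', '-fPIC']                 # stand-in for self.compiler_so
    cc_args = ['-DENABLE_PYTHON_MODULE', '-I/usr/include/python3.10']
    extra_postargs = ['-std=c++11', '-fno-math-errno']
    src, obj = '/tmp/example.cpp', '/tmp/example.o'
    deps = ['-MMD', '-MF', obj + '.d']                    # gcc-style automatic dependencies

    cmd = compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs
    print(os.path.basename(compiler_so[0]) + ':', src)    # mirrors the `display` line
    print(' '.join(cmd))                                  # what self.spawn(...) would execute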
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkiwu5btd.cpp -o /tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'cdotc_run00', cxxfile = '/tmp/tmpkiwu5btd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuofflib3', buildtmp = '/tmp/tmpo2hujcvu' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'cdotc_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
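(Illustrative sketch, not part of this log: compile_cxxfile above drives setup() as a "fake CLI call" by passing script_name/script_args instead of relying on sys.argv, and converts the SystemExit raised on failure into a CompileError; this simplified version uses a plain distutils Extension rather than PythranExtension and would need an existing C++ source file to actually build anything.)

    from tempfile import mkdtemp
    from distutils.core import setup
    from distutils.errors import CompileError
    from distutils.extension import Extension

    def build_one(module_name, cxxfile, **kwargs):
        builddir, buildtmp = mkdtemp(), mkdtemp()
        ext = Extension(module_name, [cxxfile], **kwargs)
        try:
            setup(name=module_name,
                  ext_modules=[ext],
                  script_name='setup.py',
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            raise CompileError(str(e))
        return builddir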
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkiwu5btd.cpp -o /tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'cdotc_run00', cxxfile = '/tmp/tmpkiwu5btd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuofflib3', buildtmp = '/tmp/tmpo2hujcvu' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkiwu5btd.cpp -o /tmp/tmpo2hujcvu/tmp/tmpkiwu5btd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ------------------------------ Captured log setup ------------------------------ WARNING root:spec.py:94 No pythran specification, nothing will be exported ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'cdotc_run00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpo2hujcvu/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpkiwu5btd.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/__dispatch__/conjugate.hpp:4,
                 from /tmp/tmpkiwu5btd.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpkiwu5btd.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
__________________________ TestCases.test_crotg_run0 ___________________________
[gw0] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'crotg_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpskdt2wbq.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpskdt2wbq.cpp'], output_dir = '/tmp/tmpw7a2fhi7' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o', ('/tmp/tmpskdt2wbq.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o', '/tmp/tmpskdt2wbq.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o', src = '/tmp/tmpskdt2wbq.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpskdt2wbq.cpp -o /tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'crotg_run00', cxxfile = '/tmp/tmpskdt2wbq.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpsqnng381', buildtmp = '/tmp/tmpw7a2fhi7' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'crotg_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpskdt2wbq.cpp -o /tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'crotg_run00', cxxfile = '/tmp/tmpskdt2wbq.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpsqnng381', buildtmp = '/tmp/tmpw7a2fhi7' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpskdt2wbq.cpp -o /tmp/tmpw7a2fhi7/tmp/tmpskdt2wbq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ------------------------------ Captured log setup ------------------------------ WARNING root:spec.py:94 No pythran specification, nothing will be exported ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'crotg_run00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpw7a2fhi7/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpskdt2wbq.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/__dispatch__/conjugate.hpp:4,
                 from /tmp/tmpskdt2wbq.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpskdt2wbq.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _________________________ TestCases.test_cronbach_run0 _________________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'cronbach_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpts7_eg05.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpts7_eg05.cpp'], output_dir = '/tmp/tmpa6puymau' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpa6puymau/tmp/tmpts7_eg05.o', ('/tmp/tmpts7_eg05.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpa6puymau/tmp/tmpts7_eg05.o', '/tmp/tmpts7_eg05.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpa6puymau/tmp/tmpts7_eg05.o', src = '/tmp/tmpts7_eg05.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpts7_eg05.cpp -o /tmp/tmpa6puymau/tmp/tmpts7_eg05.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'cronbach_run00', cxxfile = '/tmp/tmpts7_eg05.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbmz1awj9', buildtmp = '/tmp/tmpa6puymau' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'cronbach_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpts7_eg05.cpp -o /tmp/tmpa6puymau/tmp/tmpts7_eg05.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'cronbach_run00', cxxfile = '/tmp/tmpts7_eg05.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbmz1awj9', buildtmp = '/tmp/tmpa6puymau' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpts7_eg05.cpp -o /tmp/tmpa6puymau/tmp/tmpts7_eg05.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ------------------------------ Captured log setup ------------------------------ WARNING root:spec.py:94 No pythran specification, nothing will be exported ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'cronbach_run00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpa6puymau/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpts7_eg05.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpts7_eg05.cpp:24: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpts7_eg05.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! 
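Both failures above end in the same diagnostic from their captured stderr: pythonic/include/numpy/conjugate.hpp:25 calls xsimd::conj, the xsimd headers visible on this include path do not declare it, gcc exits non-zero, and distutils turns that into the CompileError/SystemExit chain shown in the tracebacks. The sketch below is a minimal, hedged way to hit the same code path without running the test suite, by calling the pythran.toolchain.compile_pythrancode entry point that run_test_case goes through (pythran/toolchain.py:418 above). The module name, the exported kernel and the reduced flag list are illustrative, not taken from this build.

import logging

from pythran.toolchain import compile_pythrancode

logging.getLogger("pythran").setLevel(logging.INFO)

# Illustrative kernel: np.conjugate on a complex array should pull in
# pythonic/include/numpy/conjugate.hpp, the header the captured stderr
# points at.  The export name and signature are made up for this sketch.
CODE = """
#pythran export conj_kernel(complex128[])
import numpy as np

def conj_kernel(x):
    return np.conjugate(x)
"""

try:
    # extra_compile_args is forwarded down to compile_cxxfile/PythranExtension,
    # mirroring the kwargs visible in the tracebacks above.
    native = compile_pythrancode("conj_repro", CODE,
                                 extra_compile_args=["-O1", "-w"])
    print("built", native)
except Exception as exc:
    # On the toolchain shown in this log the failure surfaces as
    # distutils.errors.CompileError wrapping the gcc exit status.
    print("compilation failed:", exc)

If this stand-alone build succeeds, the test failures are more likely due to something specific to the test environment (include order, extra flags) than to a genuine header incompatibility.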
__________________ TestCases.test_matrix_class_distance_run0 ___________________ [gw4] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'matrix_class_distance_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
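The pythran/dist.py frame above (PythranBuildExtMixIn.build_extension) strips every flag listed under the ignoreflags config key from compiler_so and linker_so before handing off to the stock build_extension; the classic case is -Wstrict-prototypes, which is meaningless for C++. The removal loop in isolation, as a sketch over a distutils-style compiler object (the flag list below is illustrative, not pythran's actual configuration):

    # Sketch of the flag-stripping pattern shown above; `compiler` is any object
    # with list-valued compiler_so / linker_so attributes (e.g. a UnixCCompiler).
    def strip_flags(compiler, flags=("-Wstrict-prototypes",)):
        for flag in flags:
            for target in ("compiler_so", "linker_so"):
                try:
                    while True:                  # drop every occurrence of the flag
                        getattr(compiler, target).remove(flag)
                except (AttributeError, ValueError):
                    pass                         # attribute missing or flag not present
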
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp2gk658mw.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp2gk658mw.cpp'], output_dir = '/tmp/tmpvjanwb3x' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o', ('/tmp/tmp2gk658mw.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
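numpy.distutils' CCompiler_compile (above) throttles concurrent compiles with a module-level semaphore sized by get_num_build_jobs() and only creates a ThreadPool when there is more than one object to build; in this run a single temporary .cpp is compiled serially. The concurrency pattern on its own, as a sketch with stand-in work items:

    # Sketch of the semaphore-limited thread-pool build used above; the work items
    # and job count are stand-ins for build.items() and get_num_build_jobs().
    import threading
    import multiprocessing.pool

    jobs = 4
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(item):
        with job_semaphore:                  # at most `jobs` compiles in flight
            print("compiling", item)         # the real code calls self._compile(...)

    build_items = ["a.o", "b.o", "c.o"]
    if len(build_items) > 1 and jobs > 1:
        with multiprocessing.pool.ThreadPool(jobs) as pool:
            pool.map(single_compile, build_items)
    else:
        for item in build_items:
            single_compile(item)
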
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o', '/tmp/tmp2gk658mw.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o', src = '/tmp/tmp2gk658mw.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gk658mw.cpp -o /tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'matrix_class_distance_run00', cxxfile = '/tmp/tmp2gk658mw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3_5a2mtj', buildtmp = '/tmp/tmpvjanwb3x' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
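The CompileError above quotes the full failing command. UnixCCompiler__compile assembles it as compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, which explains the flag order seen in the error: the hardened distribution defaults (-O2, -Wall, -DNDEBUG) come from compiler_so at the front, while the test suite's extra_compile_args land at the very end, so -O1 overrides the earlier -O2 (last -O wins), -UNDEBUG undoes -DNDEBUG, and -w suppresses the warnings enabled by -Wall. A sketch of that assembly with abbreviated stand-in flag lists:

    # Sketch of the argument assembly in UnixCCompiler__compile (frame above);
    # the flag lists are abbreviated stand-ins for the full command in the error.
    compiler_so    = ["gcc", "-O2", "-Wall", "-DNDEBUG", "-fPIC"]
    cc_args        = ["-DENABLE_PYTHON_MODULE", "-I/usr/include/python3.10", "-c"]
    src, obj       = "/tmp/tmp2gk658mw.cpp", "/tmp/tmp2gk658mw.o"
    deps           = ["-MMD", "-MF", obj + ".d"]    # only when _auto_depends is set
    extra_postargs = ["-std=c++11", "-O1", "-w", "-UNDEBUG"]

    cmd = compiler_so + cc_args + [src, "-o", obj] + deps + extra_postargs
    print(" ".join(cmd))   # trailing -O1/-w/-UNDEBUG take precedence over -O2/-Wall/-DNDEBUG
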
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'matrix_class_distance_run00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gk658mw.cpp -o /tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'matrix_class_distance_run00', cxxfile = '/tmp/tmp2gk658mw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3_5a2mtj', buildtmp = '/tmp/tmpvjanwb3x' extension 
= def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gk658mw.cpp -o /tmp/tmpvjanwb3x/tmp/tmp2gk658mw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ------------------------------ Captured log setup ------------------------------ WARNING root:spec.py:94 No pythran specification, nothing will be exported ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'matrix_class_distance_run00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 
-fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpvjanwb3x/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp2gk658mw.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmp2gk658mw.cpp:26: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp2gk658mw.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! 
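The captured stderr above contains the actual root cause: pythonic/include/numpy/conjugate.hpp:25 calls xsimd::conj(v) on a complex batch, but the xsimd headers visible to this build do not declare conj in namespace xsimd (GCC suggests std::conj instead), so any kernel whose include chain reaches numpy conj/conjugate, here via numpy.linalg.norm, fails to compile; the cc1plus notes about -Wno-absolute-value and -Wno-unknown-warning-option are only noise. A repro sketch outside the test harness, assuming it is run from the unpacked source tree with the same configuration; the kernel below is made up, but anything using np.conj pulls in the same pythonic/numpy/conj.hpp chain:

    # Hypothetical repro: compile a tiny pythran kernel that touches np.conj and
    # expect the same "'conj' is not a member of 'xsimd'" CompileError.
    import textwrap
    from distutils.errors import CompileError
    from pythran.toolchain import compile_pythrancode

    code = textwrap.dedent("""
        #pythran export use_conj(complex128[])
        import numpy as np
        def use_conj(x):
            return np.conj(x).sum()
    """)

    try:
        compile_pythrancode("xsimd_conj_repro", code)
        print("compiled fine - this xsimd provides conj")
    except CompileError as exc:
        print("reproduced:", "conj" in str(exc))
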
_____________________ TestCases.test_rand_mat_stat_norun0 ______________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'rand_mat_stat_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmplni2rxvf.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmplni2rxvf.cpp'], output_dir = '/tmp/tmpdporhida' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpdporhida/tmp/tmplni2rxvf.o', ('/tmp/tmplni2rxvf.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpdporhida/tmp/tmplni2rxvf.o', '/tmp/tmplni2rxvf.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpdporhida/tmp/tmplni2rxvf.o', src = '/tmp/tmplni2rxvf.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplni2rxvf.cpp -o /tmp/tmpdporhida/tmp/tmplni2rxvf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'rand_mat_stat_norun00', cxxfile = '/tmp/tmplni2rxvf.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpmadf276_', buildtmp = '/tmp/tmpdporhida' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'rand_mat_stat_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplni2rxvf.cpp -o /tmp/tmpdporhida/tmp/tmplni2rxvf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: module_name = 'rand_mat_stat_norun00' pythrancode = 'import numpy as np\n\nfrom numpy import trace, concatenate, dot\nfrom numpy.random import randn\nfrom numpy.linalg im..., P), 4))\n w[i] = trace(matrix_power(dot(Q.T, Q), 4))\n return np.std(v)/np.mean(v), np.std(w)/np.mean(w)\n' specs = {'rand_mat_stat': ((,),)}, opts = None, cpponly = False pyonly = False, output_file = None module_dir = '/builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/cases' report_times = False kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} spec_parser = module = error_checker = .error_checker at 0x7fff2c825990> def compile_pythrancode(module_name, pythrancode, specs=None, opts=None, cpponly=False, pyonly=False, output_file=None, module_dir=None, report_times=False, **kwargs): '''Pythran code (string) -> c++ code -> native module if `cpponly` is set to true, return the generated C++ filename if `pyonly` is set to true, prints the generated Python filename, unless `output_file` is set otherwise, return the generated native library filename ''' if pyonly: # Only generate the optimized python code content = 
generate_py(module_name, pythrancode, opts, module_dir, report_times) if output_file is None: print(content) return None else: tmp_file = _write_temp(content, '.py') output_file = output_file.format('.py') shutil.move(tmp_file, output_file) logger.info("Generated Python source file: " + output_file) # Autodetect the Pythran spec if not given as parameter from pythran.spec import spec_parser if specs is None: specs = spec_parser(pythrancode) # Generate C++, get a PythonModule object module, error_checker = generate_cxx(module_name, pythrancode, specs, opts, module_dir, report_times) if 'ENABLE_PYTHON_MODULE' in kwargs.get('undef_macros', []): module.preamble.insert(0, Line('#undef ENABLE_PYTHON_MODULE')) module.preamble.insert(0, Line('#define PY_MAJOR_VERSION {}'. format(sys.version_info.major))) if cpponly: # User wants only the C++ code tmp_file = _write_temp(str(module), '.cpp') if output_file: output_file = output_file.replace('%{ext}', '.cpp') else: output_file = module_name + ".cpp" shutil.move(tmp_file, output_file) logger.info("Generated C++ source file: " + output_file) else: if not specs: raise ValueError("Empty spec files while generating native module") # Compile to binary try: > output_file = compile_cxxcode(module_name, str(module), output_binary=output_file, **kwargs) pythran/toolchain.py:418: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'rand_mat_stat_norun00' cxxcode = '#include \n#include \n#include \n#include temporary file -> native module. Returns the generated .so. ''' # Get a temporary C++ file to compile fdpath = _write_temp(cxxcode, '.cpp') > output_binary = compile_cxxfile(module_name, fdpath, output_binary, **kwargs) pythran/toolchain.py:355: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'rand_mat_stat_norun00', cxxfile = '/tmp/tmplni2rxvf.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpmadf276_', buildtmp = '/tmp/tmpdporhida' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplni2rxvf.cpp -o /tmp/tmpdporhida/tmp/tmplni2rxvf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError During handling of the above exception, another exception occurred: node = env = {'@gen': False, '__pythran_import_numpy': {'Inf': , 'NINF': <....types.tog.TypeVariable object at 0x7fff2c7863b0>, 'b': , ...} non_generic = {, ,...thran.types.tog.TypeVariable object at 0x7fff2c784550>, , ...} def analyse(node, env, non_generic=None): """Computes the type of the expression given by node. The type of the node is computed in the context of the context of the supplied type environment env. Data types can be introduced into the language simply by having a predefined set of identifiers in the initial environment. Environment; this way there is no need to change the syntax or more importantly, the type-checking program when extending the language. Args: node: The root of the abstract syntax tree. env: The type environment is a mapping of expression identifier names to type assignments. non_generic: A set of non-generic variables, or None Returns: The computed type of the expression. 
Raises: InferenceError: The type of the expression could not be inferred, PythranTypeError: InferenceError with user friendly message + location """ if non_generic is None: non_generic = set() # expr if isinstance(node, ast.Name): if isinstance(node.ctx, (ast.Store)): new_type = TypeVariable() non_generic.add(new_type) env[node.id] = new_type return get_type(node.id, env, non_generic) elif isinstance(node, ast.Constant): if isinstance(node.value, str): return Str() elif isinstance(node.value, int): return Integer() elif isinstance(node.value, float): return Float() elif isinstance(node.value, complex): return Complex() elif node.value is None: return NoneType else: raise NotImplementedError elif isinstance(node, ast.Compare): left_type = analyse(node.left, env, non_generic) comparators_type = [analyse(comparator, env, non_generic) for comparator in node.comparators] ops_type = [analyse(op, env, non_generic) for op in node.ops] prev_type = left_type result_type = TypeVariable() for op_type, comparator_type in zip(ops_type, comparators_type): try: unify(Function([prev_type, comparator_type], result_type), op_type) prev_type = comparator_type except InferenceError: raise PythranTypeError( "Invalid comparison, between `{}` and `{}`".format( prev_type, comparator_type ), node) return result_type elif isinstance(node, ast.Call): if is_getattr(node): self_type = analyse(node.args[0], env, non_generic) attr_name = node.args[1].value _, attr_signature = attributes[attr_name] attr_type = tr(attr_signature) result_type = TypeVariable() try: unify(Function([self_type], result_type), attr_type) except InferenceError: if isinstance(prune(attr_type), MultiType): msg = 'no attribute found, tried:\n{}'.format(attr_type) else: msg = 'tried {}'.format(attr_type) raise PythranTypeError( "Invalid attribute for getattr call with self" "of type `{}`, {}".format(self_type, msg), node) else: fun_type = analyse(node.func, env, non_generic) arg_types = [analyse(arg, env, non_generic) for arg in node.args] result_type = TypeVariable() try: > unify(Function(arg_types, result_type), fun_type) pythran/types/tog.py:610: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ t1 = t2 = def unify(t1, t2): """Unify the two types t1 and t2. Makes the types t1 and t2 the same. Args: t1: The first type to be made equivalent t2: The second type to be be equivalent Returns: None Raises: InferenceError: Raised if the types cannot be unified. """ a = prune(t1) b = prune(t2) if isinstance(a, TypeVariable): if a != b: if occurs_in_type(a, b): raise InferenceError("recursive unification") a.instance = b elif isinstance(b, TypeVariable): unify(b, a) elif isinstance(a, TypeOperator) and a.name == 'any': return elif isinstance(b, TypeOperator) and b.name == 'any': return elif isinstance(a, TypeOperator) and isinstance(b, TypeOperator): if len(a.types) != len(b.types): raise InferenceError("Type length differ") else: if a.name != b.name: raise InferenceError("Type name differ") try: for p, q in zip(a.types, b.types): unify(p, q) except InferenceError: raise elif isinstance(a, MultiType) and isinstance(b, MultiType): if len(a.types) != len(b.types): raise InferenceError("Type lenght differ") for p, q in zip(a.types, b.types): unify(p, q) elif isinstance(b, MultiType): > return unify(b, a) pythran/types/tog.py:1277: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ t1 = t2 = def unify(t1, t2): """Unify the two types t1 and t2. Makes the types t1 and t2 the same. 
Args: t1: The first type to be made equivalent t2: The second type to be be equivalent Returns: None Raises: InferenceError: Raised if the types cannot be unified. """ a = prune(t1) b = prune(t2) if isinstance(a, TypeVariable): if a != b: if occurs_in_type(a, b): raise InferenceError("recursive unification") a.instance = b elif isinstance(b, TypeVariable): unify(b, a) elif isinstance(a, TypeOperator) and a.name == 'any': return elif isinstance(b, TypeOperator) and b.name == 'any': return elif isinstance(a, TypeOperator) and isinstance(b, TypeOperator): if len(a.types) != len(b.types): raise InferenceError("Type length differ") else: if a.name != b.name: raise InferenceError("Type name differ") try: for p, q in zip(a.types, b.types): unify(p, q) except InferenceError: raise elif isinstance(a, MultiType) and isinstance(b, MultiType): if len(a.types) != len(b.types): raise InferenceError("Type lenght differ") for p, q in zip(a.types, b.types): unify(p, q) elif isinstance(b, MultiType): return unify(b, a) elif isinstance(a, MultiType): types = [] for t in a.types: try: t_clone = fresh(t, {}) b_clone = fresh(b, {}) unify(t_clone, b_clone) types.append(t) except InferenceError: pass if types: if len(types) == 1: unify(clone(types[0]), b) else: # too many overloads are found, # so extract as many information as we can, # and leave the remaining over-approximated def try_unify(t, ts): if isinstance(t, TypeVariable): return if any(isinstance(tp, TypeVariable) for tp in ts): return if any(len(tp.types) != len(t.types) for tp in ts): return for i, tt in enumerate(t.types): its = [prune(tp.types[i]) for tp in ts] if any(isinstance(it, TypeVariable) for it in its): continue it0 = its[0] it0ntypes = len(it0.types) if all(((it.name == it0.name) and (len(it.types) == it0ntypes)) for it in its): ntypes = [TypeVariable() for _ in range(it0ntypes)] new_tt = TypeOperator(it0.name, ntypes) new_tt.__class__ = it0.__class__ unify(tt, new_tt) try_unify(prune(tt), [prune(it) for it in its]) try_unify(b, types) else: > raise InferenceError("No overload") E pythran.types.tog.InferenceError: No overload pythran/types/tog.py:1318: InferenceError During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:425: in compile_pythrancode error_checker() pythran/toolchain.py:170: in error_checker types = tog.typecheck(ir) pythran/types/tog.py:1423: in typecheck types = analyse(node, {'builtins': MODULES['builtins']}) pythran/types/tog.py:921: in analyse analyse_body(node.body, env, non_generic) pythran/types/tog.py:504: in analyse_body analyse(stmt, env, non_generic) pythran/types/tog.py:905: in analyse analyse_body(node.body, new_env, new_non_generic) pythran/types/tog.py:504: in analyse_body analyse(stmt, env, non_generic) pythran/types/tog.py:1029: in analyse analyse_body(node.body, env, non_generic) pythran/types/tog.py:504: in analyse_body analyse(stmt, env, non_generic) 
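The unify() routine quoted twice above is the core of pythran's type checker: it prunes both types, binds type variables, accepts two TypeOperators only when their names and arities match, and falls back to overload filtering for MultiType. A toy sketch, under the assumption that the pythran.types.tog primitives named in the quoted source can be used directly; the ground types Int and Float below are built by hand for illustration and are not pythran's own definitions.

    # Toy unification example built only on names visible in the quoted source
    # (TypeVariable, TypeOperator, Function, unify, prune, InferenceError).
    from pythran.types.tog import (TypeVariable, TypeOperator, Function,
                                   unify, prune, InferenceError)

    Int = TypeOperator('int', [])      # illustrative ground types
    Float = TypeOperator('float', [])

    ret = TypeVariable()
    # Typing a call f(<int>) against a signature (int) -> float binds `ret`:
    unify(Function([Int], ret), Function([Int], Float))
    print(prune(ret).name)             # expected: 'float'

    try:
        # Mismatched leaf operators ('float' vs 'int') cannot be unified:
        unify(Function([Float], TypeVariable()), Function([Int], Float))
    except InferenceError as err:
        print("no unifier:", err)      # raised as "Type name differ" above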
pythran/types/tog.py:948: in analyse defn_type = analyse(node.value, env, non_generic) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = env = {'@gen': False, '__pythran_import_numpy': {'Inf': , 'NINF': <....types.tog.TypeVariable object at 0x7fff2c7863b0>, 'b': , ...} non_generic = {, ,...thran.types.tog.TypeVariable object at 0x7fff2c784550>, , ...} def analyse(node, env, non_generic=None): """Computes the type of the expression given by node. The type of the node is computed in the context of the context of the supplied type environment env. Data types can be introduced into the language simply by having a predefined set of identifiers in the initial environment. Environment; this way there is no need to change the syntax or more importantly, the type-checking program when extending the language. Args: node: The root of the abstract syntax tree. env: The type environment is a mapping of expression identifier names to type assignments. non_generic: A set of non-generic variables, or None Returns: The computed type of the expression. Raises: InferenceError: The type of the expression could not be inferred, PythranTypeError: InferenceError with user friendly message + location """ if non_generic is None: non_generic = set() # expr if isinstance(node, ast.Name): if isinstance(node.ctx, (ast.Store)): new_type = TypeVariable() non_generic.add(new_type) env[node.id] = new_type return get_type(node.id, env, non_generic) elif isinstance(node, ast.Constant): if isinstance(node.value, str): return Str() elif isinstance(node.value, int): return Integer() elif isinstance(node.value, float): return Float() elif isinstance(node.value, complex): return Complex() elif node.value is None: return NoneType else: raise NotImplementedError elif isinstance(node, ast.Compare): left_type = analyse(node.left, env, non_generic) comparators_type = [analyse(comparator, env, non_generic) for comparator in node.comparators] ops_type = [analyse(op, env, non_generic) for op in node.ops] prev_type = left_type result_type = TypeVariable() for op_type, comparator_type in zip(ops_type, comparators_type): try: unify(Function([prev_type, comparator_type], result_type), op_type) prev_type = comparator_type except InferenceError: raise PythranTypeError( "Invalid comparison, between `{}` and `{}`".format( prev_type, comparator_type ), node) return result_type elif isinstance(node, ast.Call): if is_getattr(node): self_type = analyse(node.args[0], env, non_generic) attr_name = node.args[1].value _, attr_signature = attributes[attr_name] attr_type = tr(attr_signature) result_type = TypeVariable() try: unify(Function([self_type], result_type), attr_type) except InferenceError: if isinstance(prune(attr_type), MultiType): msg = 'no attribute found, tried:\n{}'.format(attr_type) else: msg = 'tried {}'.format(attr_type) raise PythranTypeError( "Invalid attribute for getattr call with self" "of type `{}`, {}".format(self_type, msg), node) else: fun_type = analyse(node.func, env, non_generic) arg_types = [analyse(arg, env, non_generic) for arg in node.args] result_type = TypeVariable() try: unify(Function(arg_types, result_type), fun_type) except InferenceError: # recover original type fun_type = analyse(node.func, env, non_generic) if isinstance(prune(fun_type), MultiType): msg = 'no overload found, tried:\n{}'.format(fun_type) else: msg = 'tried {}'.format(fun_type) > raise PythranTypeError( "Invalid argument type for function call to " "`Callable[[{}], ...]`, {}" .format(', '.join('{}'.format(at) 
for at in arg_types), msg), node) E File "", line 18 E pythran.types.tog.PythranTypeError: Invalid argument type for function call to `Callable[[Tuple[int, T128, T129], int], ...]`, no overload found, tried: E Callable[[Iterable[Iterable[Iterable[bool]]]], Array[2d, bool]] E Callable[[Iterable[Iterable[Iterable[complex]]]], Array[2d, complex]] E Callable[[Iterable[Iterable[Iterable[float]]]], Array[2d, float]] E Callable[[Iterable[Iterable[Iterable[int]]]], Array[2d, int]] E Callable[[Iterable[Iterable[bool]]], Array[1d, bool]] E Callable[[Iterable[Iterable[complex]]], Array[1d, complex]] E Callable[[Iterable[Iterable[float]]], Array[1d, float]] E Callable[[Iterable[Iterable[int]]], Array[1d, int]] E Callable[[Tuple[int, T0, T1]], Array[1d, bool]] E Callable[[Tuple[int, T10, T11]], Array[1d, bool]] E Callable[[Tuple[int, T100, T101]], Array[2d, float]] E Callable[[Tuple[int, T102, T103]], Array[2d, float]] E Callable[[Tuple[int, T104, T105]], Array[2d, float]] E Callable[[Tuple[int, T106, T107]], Array[2d, float]] E Callable[[Tuple[int, T108, T109]], Array[2d, float]] E Callable[[Tuple[int, T110, T111]], Array[2d, float]] E Callable[[Tuple[int, T112, T113]], Array[2d, complex]] E Callable[[Tuple[int, T114, T115]], Array[2d, complex]] E Callable[[Tuple[int, T116, T117]], Array[2d, complex]] E Callable[[Tuple[int, T118, T119]], Array[2d, complex]] E Callable[[Tuple[int, T12, T13]], Array[1d, bool]] E Callable[[Tuple[int, T120, T121]], Array[2d, complex]] E Callable[[Tuple[int, T122, T123]], Array[2d, complex]] E Callable[[Tuple[int, T124, T125]], Array[2d, complex]] E Callable[[Tuple[int, T126, T127]], Array[2d, complex]] E Callable[[Tuple[int, T14, T15]], Array[1d, bool]] E Callable[[Tuple[int, T16, T17]], Array[1d, int]] E Callable[[Tuple[int, T18, T19]], Array[1d, int]] E Callable[[Tuple[int, T2, T3]], Array[1d, bool]] E Callable[[Tuple[int, T20, T21]], Array[1d, int]] E Callable[[Tuple[int, T22, T23]], Array[1d, int]] E Callable[[Tuple[int, T24, T25]], Array[1d, int]] E Callable[[Tuple[int, T26, T27]], Array[1d, int]] E Callable[[Tuple[int, T28, T29]], Array[1d, int]] E Callable[[Tuple[int, T30, T31]], Array[1d, int]] E Callable[[Tuple[int, T32, T33]], Array[1d, float]] E Callable[[Tuple[int, T34, T35]], Array[1d, float]] E Callable[[Tuple[int, T36, T37]], Array[1d, float]] E Callable[[Tuple[int, T38, T39]], Array[1d, float]] E Callable[[Tuple[int, T4, T5]], Array[1d, bool]] E Callable[[Tuple[int, T40, T41]], Array[1d, float]] E Callable[[Tuple[int, T42, T43]], Array[1d, float]] E Callable[[Tuple[int, T44, T45]], Array[1d, float]] E Callable[[Tuple[int, T46, T47]], Array[1d, float]] E Callable[[Tuple[int, T48, T49]], Array[1d, complex]] E Callable[[Tuple[int, T50, T51]], Array[1d, complex]] E Callable[[Tuple[int, T52, T53]], Array[1d, complex]] E Callable[[Tuple[int, T54, T55]], Array[1d, complex]] E Callable[[Tuple[int, T56, T57]], Array[1d, complex]] E Callable[[Tuple[int, T58, T59]], Array[1d, complex]] E Callable[[Tuple[int, T6, T7]], Array[1d, bool]] E Callable[[Tuple[int, T60, T61]], Array[1d, complex]] E Callable[[Tuple[int, T62, T63]], Array[1d, complex]] E Callable[[Tuple[int, T64, T65]], Array[2d, bool]] E Callable[[Tuple[int, T66, T67]], Array[2d, bool]] E Callable[[Tuple[int, T68, T69]], Array[2d, bool]] E Callable[[Tuple[int, T70, T71]], Array[2d, bool]] E Callable[[Tuple[int, T72, T73]], Array[2d, bool]] E Callable[[Tuple[int, T74, T75]], Array[2d, bool]] E Callable[[Tuple[int, T76, T77]], Array[2d, bool]] E Callable[[Tuple[int, T78, T79]], Array[2d, bool]] E 
Callable[[Tuple[int, T8, T9]], Array[1d, bool]] E Callable[[Tuple[int, T80, T81]], Array[2d, int]] E Callable[[Tuple[int, T82, T83]], Array[2d, int]] E Callable[[Tuple[int, T84, T85]], Array[2d, int]] E Callable[[Tuple[int, T86, T87]], Array[2d, int]] E Callable[[Tuple[int, T88, T89]], Array[2d, int]] E Callable[[Tuple[int, T90, T91]], Array[2d, int]] E Callable[[Tuple[int, T92, T93]], Array[2d, int]] E Callable[[Tuple[int, T94, T95]], Array[2d, int]] E Callable[[Tuple[int, T96, T97]], Array[2d, float]] E Callable[[Tuple[int, T98, T99]], Array[2d, float]] pythran/types/tog.py:618: PythranTypeError ------------------------------ Captured log setup ------------------------------ WARNING root:spec.py:94 No pythran specification, nothing will be exported ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'rand_mat_stat_norun00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpdporhida/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmplni2rxvf.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5, from /tmp/tmplni2rxvf.cpp:32: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: 
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmplni2rxvf.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... ______________________ TestComplex.test_complex256_array4 ______________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_complex256_array4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
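The decisive diagnostic above is cc1plus rejecting `xsimd::conj` in pythonic/include/numpy/conjugate.hpp:25. Because %prep removed the bundled third_party/xsimd, that header is compiled against the system xsimd, which apparently does not expose `conj` in its namespace, so every generated module that pulls in conjugate.hpp (via .conj()/np.conjugate, or indirectly via np.std/np.var as in rand_mat_stat above) fails the same way. A small hypothetical reproducer, assuming any exported function calling .conj() reaches the same header on this build setup:

    # Hypothetical standalone reproducer for the xsimd::conj failure above.
    # The module name "conj_repro" and the sample code are illustrative only;
    # on an affected build the CompileError should carry the same gcc command.
    from distutils.errors import CompileError
    from pythran.toolchain import compile_pythrancode

    CODE = """
    #pythran export conj_sum(complex128[])
    import numpy as np
    def conj_sum(x):
        return x.conj(), x.sum()
    """

    try:
        compile_pythrancode("conj_repro", CODE)
    except CompileError as err:
        print(err)   # expected to report exit status 1 from the gcc invocation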
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpyzjfyazw.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpyzjfyazw.cpp'], output_dir = '/tmp/tmpumqgnugz' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o', ('/tmp/tmpyzjfyazw.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
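CCompiler_compile above chooses between the serial loop and a ThreadPool based on get_num_build_jobs(). A quick way to see what job count that resolves to in a given environment is sketched below; the NPY_NUM_BUILD_JOBS value of 4 is purely illustrative, and as far as the numpy.distutils code quoted above goes, the environment override applies when no build command supplies its own parallel setting.

    # Quick check of the job count used by the parallel-compile dispatch above.
    # The value 4 is an arbitrary example; get_num_build_jobs() lives in
    # numpy.distutils.misc_util.
    import os
    from numpy.distutils.misc_util import get_num_build_jobs

    os.environ["NPY_NUM_BUILD_JOBS"] = "4"
    print(get_num_build_jobs())   # typically 4 when run outside a setup() call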
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o', '/tmp/tmpyzjfyazw.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o', src = '/tmp/tmpyzjfyazw.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyzjfyazw.cpp -o /tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_complex256_array4', cxxfile = '/tmp/tmpyzjfyazw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphy80fdtz', buildtmp = '/tmp/tmpumqgnugz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_complex256_array4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyzjfyazw.cpp -o /tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = @unittest.skipIf(not has_float128, "not float128") def test_complex256_array4(self): > self.run_test('def complex256_array4(x): return x.conj(), x.sum()', np.array([2j, 2], dtype=np.complex256)** 7, complex256_array4=[NDArray[np.complex256, :]]) pythran/tests/test_complex.py:174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_complex256_array4', cxxfile = '/tmp/tmpyzjfyazw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphy80fdtz', buildtmp = '/tmp/tmpumqgnugz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
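For reference, the chain shown above (run_test -> compile_pythrancode -> compile_cxxcode -> compile_cxxfile) can be driven outside the test suite to reproduce a single failure. A minimal sketch, assuming a pythran 0.11.0 checkout and that pythran.toolchain.compile_pythrancode keeps the signature used in these frames; the module name, spec line and conj_demo function are invented for illustration:

    # Hypothetical repro sketch (not part of the captured log): compile the same
    # conj()/sum() pattern that test_complex256_array4 exercises, without pytest.
    import textwrap
    from pythran.toolchain import compile_pythrancode

    source = textwrap.dedent("""\
        #pythran export conj_demo(complex128[:])
        def conj_demo(x):
            return x.conj(), x.sum()
        """)
    # On this builder the call is expected to raise CompileError, because the
    # generated C++ reaches xsimd::conj() in pythonic/include/numpy/conjugate.hpp
    # (see the cc1plus output captured further down).
    compile_pythrancode("conj_demo", source)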
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyzjfyazw.cpp -o /tmp/tmpumqgnugz/tmp/tmpyzjfyazw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_complex256_array4' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpumqgnugz/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpyzjfyazw.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /tmp/tmpyzjfyazw.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpyzjfyazw.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
__________________________ TestComplex.test_conjugate __________________________
[gw2] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_test_conjugate', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs to do,
        in a highly flexible and user-driven way.  Briefly: create a Distribution
        instance; find and parse config files; parse the command line; run each
        Distutils command found there, customized by the options supplied to
        'setup()' (as keyword arguments), in config files, and on the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
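The PythranBuildExt/PythranExtension pair being exercised here is the same public API a package's own setup.py would use; compile_cxxfile above merely fakes the command line for it. A minimal packaging sketch, assuming a project that ships a pythran-annotated mymodule.py (the project and file names are invented for illustration):

    # Hypothetical setup.py sketch (not from the log): mirrors the cmdclass /
    # ext_modules wiring that compile_cxxfile passes to setup() above.
    from distutils.core import setup
    from pythran.dist import PythranExtension, PythranBuildExt

    setup(
        name="mymodule",
        ext_modules=[PythranExtension("mymodule", sources=["mymodule.py"])],
        cmdclass={"build_ext": PythranBuildExt},
    )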
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfh5j_xo9.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfh5j_xo9.cpp'], output_dir = '/tmp/tmpbwuk4tt2' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
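The docstring above is accurate for what follows: compile() maps each source file to an object file and turns a failed gcc invocation into CompileError. To replay just that step by hand, a minimal sketch, assuming the Python 3.10 distutils used throughout this traceback; the output directory and flag list are trimmed for illustration:

    # Hypothetical sketch (not from the log): drive a distutils compiler the same
    # way CCompiler_compile does -- new_compiler() + customize_compiler() + compile().
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    cc = new_compiler()
    customize_compiler(cc)           # pulls in the distro CFLAGS seen in this log
    objects = cc.compile(
        ["/tmp/tmpfh5j_xo9.cpp"],    # temporary C++ file named in the frame above
        output_dir="/tmp/objdir",
        extra_postargs=["-std=c++11", "-fno-math-errno"],
    )
    print(objects)                   # raises distutils.errors.CompileError on failure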
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o', ('/tmp/tmpfh5j_xo9.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
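The parallelism in this frame (a semaphore sized by get_num_build_jobs() and a ThreadPool over build_items) is what interleaves the per-test compilations in this log. To get serial, easier-to-read failures while debugging, a sketch, assuming numpy.distutils' documented NPY_NUM_BUILD_JOBS environment variable:

    # Hypothetical sketch (not from the log): force numpy.distutils to compile
    # serially.  NPY_NUM_BUILD_JOBS is read by get_num_build_jobs(), the helper
    # used above to size the job semaphore.
    import os
    from numpy.distutils.misc_util import get_num_build_jobs

    os.environ["NPY_NUM_BUILD_JOBS"] = "1"
    print(get_num_build_jobs())      # expected to report 1 with the override in place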
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o', '/tmp/tmpfh5j_xo9.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o', src = '/tmp/tmpfh5j_xo9.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfh5j_xo9.cpp -o /tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_test_conjugate', cxxfile = '/tmp/tmpfh5j_xo9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa7on4lnb', buildtmp = '/tmp/tmpbwuk4tt2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_test_conjugate', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfh5j_xo9.cpp -o /tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_conjugate(self): """ Check complex conjugate. 
Checked for: * Method and numpy function call * conj and conjugate for each of them * complex and array (1 and 2 D) """ > self.run_test(""" def test_conjugate(c, a, a2d): import numpy as np return (np.conj(c), np.conj(a), a2d.conj(), np.conjugate(c), np.conjugate(a), a2d.conjugate()) """, 3 + 2j, np.array([3 + 2j]), np.array([[3 + 2j]]), test_conjugate=[np.complex128, NDArray[np.complex128, :], NDArray[complex, :, :]]) pythran/tests/test_complex.py:47: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_test_conjugate', cxxfile = '/tmp/tmpfh5j_xo9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa7on4lnb', buildtmp = '/tmp/tmpbwuk4tt2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfh5j_xo9.cpp -o /tmp/tmpbwuk4tt2/tmp/tmpfh5j_xo9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG 
-Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_test_conjugate' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpbwuk4tt2/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpfh5j_xo9.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/__dispatch__/conjugate.hpp:4, from /tmp/tmpfh5j_xo9.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpfh5j_xo9.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
___________________________ TestNumpyFunc0.test_var3 ___________________________
[gw2] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var3', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs to do,
        in a highly flexible and user-driven way.  Briefly: create a Distribution
        instance; find and parse config files; parse the command line; run each
        Distutils command found there, customized by the options supplied to
        'setup()' (as keyword arguments), in config files, and on the command line.

        The Distribution instance might be an instance of a class supplied via the
        'distclass' keyword argument to 'setup'; if no such class is supplied, then
        the Distribution class (in dist.py) is instantiated.  All other arguments to
        'setup' (except for 'cmdclass') are used to set attributes of the
        Distribution instance.

        The 'cmdclass' argument, if supplied, is a dictionary mapping command names
        to command classes.  Each command encountered on the command line will be
        turned into a command class, which is in turn instantiated; any class found
        in 'cmdclass' is used in place of the default, which is (for command
        'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'.  The
        command class must provide a 'user_options' attribute which is a list of
        option specifiers for 'distutils.fancy_getopt'.  Any command-line options
        between the current and the next command are used to set attributes of the
        current command object.

        When the entire command-line has been successfully parsed, calls the 'run()'
        method on each command object in turn.  This method will be driven entirely
        by the Distribution object (which each command object has a reference to,
        thanks to its constructor), and the command-specific options that became
        attributes of each command object.
        """

        global _setup_stop_after, _setup_distribution

        # Determine the distribution class -- either caller-supplied or
        # our Distribution (see below).
        klass = attrs.get('distclass')
        if klass:
            del attrs['distclass']
        else:
            klass = Distribution

        if 'script_name' not in attrs:
            attrs['script_name'] = os.path.basename(sys.argv[0])
        if 'script_args' not in attrs:
            attrs['script_args'] = sys.argv[1:]

        # Create the Distribution instance, using the remaining arguments
        # (ie.
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp6atjd8_p.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp6atjd8_p.cpp'], output_dir = '/tmp/tmp7aiuad1m' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
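The CCompiler_compile docstring above is generic; in this build the call reduces to roughly the sketch below, with the sources, macros and include directories copied from the locals shown above and the flag list truncated just as it is in the log. The sketch uses plain distutils for brevity; the build actually goes through numpy.distutils' wrapper, but the compile() signature is the same.

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

cc = new_compiler()        # unix-style compiler object on this builder
customize_compiler(cc)     # injects the distro CFLAGS visible in the gcc command line
objects = cc.compile(
    ['/tmp/tmp6atjd8_p.cpp'],
    output_dir='/tmp/tmp7aiuad1m',
    macros=[('ENABLE_PYTHON_MODULE', None),
            ('__PYTHRAN__', '3'),
            ('PYTHRAN_BLAS_OPENBLAS', None)],
    include_dirs=['/usr/include/flexiblas',
                  '/builddir/build/BUILD/pythran-feature-0.11.0/pythran',
                  '/usr/lib64/python3.10/site-packages/numpy/core/include'],
    extra_postargs=['-std=c++11', '-fno-math-errno',
                    '-fvisibility=hidden', '-fno-wrapv'],  # truncated, see the log
    depends=[])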
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o', ('/tmp/tmp6atjd8_p.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o', '/tmp/tmp6atjd8_p.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o', src = '/tmp/tmp6atjd8_p.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6atjd8_p.cpp -o /tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var3', cxxfile = '/tmp/tmp6atjd8_p.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_tj1admg', buildtmp = '/tmp/tmp7aiuad1m' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
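The remainder of this frame is the error-propagation tail: the CompileError raised by the compiler wrapper surfaces from dist.run_commands() as a DistutilsError, setup() converts it into SystemExit("error: ..."), and pythran's compile_cxxfile (pythran/toolchain.py, shown further down) converts that back into a CompileError for the test harness. Reduced to a sketch, with run_setup standing in for the setup(...) call from the traceback:

from distutils.errors import CompileError

def compile_cxxfile_sketch(run_setup):
    # distutils reports command failures by raising SystemExit("error: ...");
    # pythran re-wraps that so callers only ever see CompileError.
    try:
        run_setup()
    except SystemExit as e:
        raise CompileError(str(e))

This is also why the same gcc command line is printed three times per failing test: once in the original CompileError, once in the SystemExit message, and once in the re-raised CompileError.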
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6atjd8_p.cpp -o /tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var3(self): > self.run_test("def np_var3(a): from numpy import var ; return var(a, 0)", numpy.array([[[1, 2], [3, 4.]]]), np_var3=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:322: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var3', cxxfile = '/tmp/tmp6atjd8_p.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_tj1admg', buildtmp = '/tmp/tmp7aiuad1m' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6atjd8_p.cpp -o /tmp/tmp7aiuad1m/tmp/tmp6atjd8_p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp7aiuad1m/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp6atjd8_p.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp6atjd8_p.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp6atjd8_p.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var4 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
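The actual failure, visible in the captured stderr above, is in the generated C++: pythran's vectorized wrapper in pythonic/include/numpy/conjugate.hpp calls xsimd::conj, and the xsimd headers used for this build do not provide it, so every test that reaches that code path (numpy.var goes through conjugate) fails with the same diagnostic; the cc1plus notes about -Wno-absolute-value and -Wno-unknown-warning-option are noise. This looks like a mismatch between pythran 0.11.0's pythonic headers and the xsimd version they are being compiled against, not a problem in the individual tests. A self-contained reproduction outside the test suite, assuming pythran's usual #pythran export annotation (the module name is arbitrary):

from pythran.toolchain import compile_pythrancode

code = '''
#pythran export np_var3(float[:,:,:])
def np_var3(a):
    from numpy import var
    return var(a, 0)
'''
# Compiles the generated C++ with the same compiler and flags as the test run,
# so it should fail with the same "'conj' is not a member of 'xsimd'" error.
compile_pythrancode("test_np_var3", code)

The test_var4 failure that follows is the same error reached through an identical call chain; only the temporary file names differ.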
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
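The loop just below strips any flags listed under pythran's "ignoreflags" setting out of compiler_so and linker_so. Its removal idiom, isolated as a sketch with an example flag rather than pythran's configured list:

compiler_so = ['gcc', '-Wstrict-prototypes', '-O2', '-Wstrict-prototypes']
flag = '-Wstrict-prototypes'       # example flag, not read from pythran's cfg
try:
    while True:                    # list.remove drops one occurrence per call,
        compiler_so.remove(flag)   # so loop until ValueError says none are left
except ValueError:
    pass
assert compiler_so == ['gcc', '-O2']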
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpw4qljilw.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpw4qljilw.cpp'], output_dir = '/tmp/tmpzs_g2ubl' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o', ('/tmp/tmpw4qljilw.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o', '/tmp/tmpw4qljilw.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o', src = '/tmp/tmpw4qljilw.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw4qljilw.cpp -o /tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var4', cxxfile = '/tmp/tmpw4qljilw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjjd8omqi', buildtmp = '/tmp/tmpzs_g2ubl' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw4qljilw.cpp -o /tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var4(self): > self.run_test("def np_var4(a): from numpy import var ; return var(a, 1)", numpy.array([[[1, 2], [3, 4.]]]), np_var4=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:325: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var4', cxxfile = '/tmp/tmpw4qljilw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjjd8omqi', buildtmp = '/tmp/tmpzs_g2ubl' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw4qljilw.cpp -o /tmp/tmpzs_g2ubl/tmp/tmpw4qljilw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var4' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpzs_g2ubl/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpw4qljilw.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpw4qljilw.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpw4qljilw.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var5 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
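[editor's note] The captured stderr above is the real failure behind every CompileError in this run: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj(), which the xsimd headers on the include path do not provide under that name (the compiler suggests std::conj instead). This looks like an API mismatch between the pythran 0.11.0 headers and the xsimd version visible to gcc; treat that diagnosis as a hypothesis, not something the log states. The compilation can be reproduced outside pytest with a sketch like the following, which feeds the same numpy.var kernel through pythran's toolchain.

    # Reproduce the failing compilation outside the test suite (a sketch;
    # assumes pythran 0.11.0 and the same compiler environment as this chroot).
    # The kernel and the export spec mirror TestNumpyFunc0.test_var4 above.
    from distutils.errors import CompileError
    from pythran.toolchain import compile_pythrancode

    code = """
    #pythran export np_var4(float[:,:,:])
    def np_var4(a):
        from numpy import var
        return var(a, 1)
    """

    try:
        compile_pythrancode("demo_np_var4", code)
    except CompileError as exc:
        # Expected here: the "'conj' is not a member of 'xsimd'" error above.
        print(exc)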
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
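[editor's note] The build_ext.run() frame above only wires the compiler object up; the pythran-specific defines and include paths seen in the gcc command (-DENABLE_PYTHON_MODULE, -D__PYTHRAN__=3, -DPYTHRAN_BLAS_OPENBLAS, -I/usr/include/flexiblas, ...) come from pythran's own configuration. One quick way to inspect them, assuming pythran 0.11 exposes make_extension() in pythran.config as sketched here:

    # Dump the Extension kwargs pythran contributes (include_dirs, macros,
    # extra_compile_args, ...); an assumption about pythran 0.11's config API.
    import pprint
    from pythran import config

    pprint.pprint(config.make_extension(python=True))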
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfyw5v6a3.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfyw5v6a3.cpp'], output_dir = '/tmp/tmpc13h3tuw' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o', ('/tmp/tmpfyw5v6a3.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
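[editor's note] CCompiler_compile above funnels the per-file compiles through a global semaphore sized by get_num_build_jobs(). When chasing an error like this one, forcing a single job keeps each gcc invocation next to its diagnostics in the log; numpy.distutils is assumed here to read the job count from the NPY_NUM_BUILD_JOBS environment variable.

    # Force numpy.distutils to compile serially so the failing gcc command and
    # its stderr stay adjacent in the log; NPY_NUM_BUILD_JOBS is assumed to be
    # honoured by get_num_build_jobs() in this numpy release.
    import os
    from numpy.distutils.misc_util import get_num_build_jobs

    os.environ['NPY_NUM_BUILD_JOBS'] = '1'
    print(get_num_build_jobs())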
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o', '/tmp/tmpfyw5v6a3.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o', src = '/tmp/tmpfyw5v6a3.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfyw5v6a3.cpp -o /tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var5', cxxfile = '/tmp/tmpfyw5v6a3.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu6jnk9gy', buildtmp = '/tmp/tmpc13h3tuw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfyw5v6a3.cpp -o /tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var5(self): > self.run_test("def np_var5(a): from numpy import var ; return var(a, 2)", numpy.array([[[1, 2], [3, 4.]]]), np_var5=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:328: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var5', cxxfile = '/tmp/tmpfyw5v6a3.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu6jnk9gy', buildtmp = '/tmp/tmpc13h3tuw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfyw5v6a3.cpp -o /tmp/tmpc13h3tuw/tmp/tmpfyw5v6a3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var5' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpc13h3tuw/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpfyw5v6a3.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpfyw5v6a3.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpfyw5v6a3.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var6 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
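[editor's note] The same xsimd::conj diagnostic repeats for test_var5 above and test_var6 below, so the three failures share one root cause. To iterate on a fix without replaying the whole suite, the affected cases can be re-run directly; a sketch that assumes pytest is importable inside the chroot:

    # Re-run only the numpy.var cases failing in this log; '-x' stops at the
    # first failure so the C++ diagnostic is easy to find.
    import pytest

    pytest.main(["pythran/tests/test_numpy_func0.py",
                 "-k", "var4 or var5 or var6",
                 "-x"])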
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp2gto6szx.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp2gto6szx.cpp'], output_dir = '/tmp/tmp79se1qy7' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
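(Aside: the job count that the semaphore logic below enforces comes from numpy.distutils, which derives it from the CPU count and honours the build_ext --parallel option and the NPY_NUM_BUILD_JOBS environment variable. An illustrative check of that value, not part of this build:

from numpy.distutils.misc_util import get_num_build_jobs
print(get_num_build_jobs())   # derived from CPU count, --parallel and NPY_NUM_BUILD_JOBS
)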
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp79se1qy7/tmp/tmp2gto6szx.o', ('/tmp/tmp2gto6szx.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp79se1qy7/tmp/tmp2gto6szx.o', '/tmp/tmp2gto6szx.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp79se1qy7/tmp/tmp2gto6szx.o', src = '/tmp/tmp2gto6szx.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gto6szx.cpp -o /tmp/tmp79se1qy7/tmp/tmp2gto6szx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var6', cxxfile = '/tmp/tmp2gto6szx.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn26mvh4j', buildtmp = '/tmp/tmp79se1qy7' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
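Worth noting alongside this gateway: pythran's compile_cxxfile() (captured earlier in this traceback) drives it with a synthetic command line via script_name/script_args rather than sys.argv, which is why its source carries the "fake CLI call" comment. A stripped-down sketch of that pattern, using a hypothetical demo.c placeholder source:

from tempfile import mkdtemp
from distutils.core import setup
from distutils.extension import Extension

builddir, buildtmp = mkdtemp(), mkdtemp()            # throwaway dirs, as in compile_cxxfile
setup(name="demo",
      ext_modules=[Extension("demo", ["demo.c"])],   # demo.c is a placeholder and must exist
      script_name="setup.py",                        # pretend we were invoked as a script
      script_args=["--quiet", "build_ext",
                   "--build-lib", builddir,
                   "--build-temp", buildtmp])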
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gto6szx.cpp -o /tmp/tmp79se1qy7/tmp/tmp2gto6szx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var6(self): > self.run_test("def np_var6(a): from numpy import var ; return var(1j * a)", numpy.array([[[1, 2], [3, 4.]]]), np_var6=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:331: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var6', cxxfile = '/tmp/tmp2gto6szx.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn26mvh4j', buildtmp = '/tmp/tmp79se1qy7' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2gto6szx.cpp -o /tmp/tmp79se1qy7/tmp/tmp2gto6szx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var6' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp79se1qy7/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp2gto6szx.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp2gto6szx.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp2gto6szx.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var7 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
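To summarize the captured stderr above: the failure originates in pythran's pythonic/include/numpy/conjugate.hpp, whose vectorized conjugate wrapper returns xsimd::conj(v) on a complex batch, and the xsimd headers picked up on this include path do not declare conj there (hence gcc's suggestion of std::conj). numpy.var on a complex value reaches that header through the include chain var.hpp -> conjugate.hpp shown above, so the failure can be reproduced outside the test harness via the toolchain API seen in these frames. A reproduction sketch (assumption: run from the prepared source tree so the same headers and pythran configuration are used):

from pythran.toolchain import compile_pythrancode

code = """
#pythran export np_var6(float[:,:,:])
def np_var6(a):
    from numpy import var
    return var(1j * a)   # 1j * a makes the value complex, pulling in conjugate.hpp
"""

try:
    compile_pythrancode("test_np_var6", code)   # specs are taken from the #pythran export line
except Exception as exc:                        # the gcc failure surfaces as a CompileError
    print("compilation failed:", exc)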
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
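Side note on the ignoreflags loop that follows: the list of flags stripped from compiler_so/linker_so comes from pythran's own configuration (section 'compiler', key 'ignoreflags'), which the comment above suggests covers at least -Wstrict-prototypes. An illustrative way to inspect it, assuming the same import path pythran/dist.py uses internally:

from pythran import config as cfg                # dist.py accesses this as cfg.cfg
print(cfg.cfg.get('compiler', 'ignoreflags'))    # e.g. '-Wstrict-prototypes'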
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphzm_mom2.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphzm_mom2.cpp'], output_dir = '/tmp/tmpdc2plof0' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpdc2plof0/tmp/tmphzm_mom2.o', ('/tmp/tmphzm_mom2.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpdc2plof0/tmp/tmphzm_mom2.o', '/tmp/tmphzm_mom2.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpdc2plof0/tmp/tmphzm_mom2.o', src = '/tmp/tmphzm_mom2.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphzm_mom2.cpp -o /tmp/tmpdc2plof0/tmp/tmphzm_mom2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var7', cxxfile = '/tmp/tmphzm_mom2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0du6nmfb', buildtmp = '/tmp/tmpdc2plof0' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphzm_mom2.cpp -o /tmp/tmpdc2plof0/tmp/tmphzm_mom2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var7(self): > self.run_test("def np_var7(a): from numpy import var ; return var(1j * a, 2)", numpy.array([[[1, 2], [3, 4.]]]), np_var7=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:334: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var7', cxxfile = '/tmp/tmphzm_mom2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0du6nmfb', buildtmp = '/tmp/tmpdc2plof0' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphzm_mom2.cpp -o /tmp/tmpdc2plof0/tmp/tmphzm_mom2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var7' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpdc2plof0/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmphzm_mom2.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmphzm_mom2.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmphzm_mom2.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var8 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
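The compiler diagnostic captured above is the actual root cause of this failure: pythonic/include/numpy/conjugate.hpp line 25 calls xsimd::conj(), which the xsimd headers found by this build do not provide for the complex batch type. The kernel behind the test is tiny; a stand-alone reproduction would look like the file below. The export line is an assumption, mapping the NDArray[float,:,:,:] signature from the test to pythran's spec syntax; compiling the file (for instance with the pythran command-line tool) should go through the same compile_cxxfile() path and hit the same error in conjugate.hpp.

# np_var7.py -- stand-alone version of the kernel behind TestNumpyFunc0.test_var7
#pythran export np_var7(float[:,:,:])
def np_var7(a):
    from numpy import var
    return var(1j * a, 2)   # 1j * a produces a complex array, which reaches the xsimd conjugate wrapper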
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
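Stripped of error handling, the dispatch quoted in these frames (run_commands() looping over run_command(), which finalizes and runs each command object) reduces to the following sketch. The Distribution here is a throw-away one with no extensions, so build_ext returns immediately; it only shows the path that pythran's fake command line takes.

from distutils.dist import Distribution

dist = Distribution({'name': 'demo',
                     'script_name': 'setup.py',
                     'script_args': ['--quiet', 'build_ext']})
if dist.parse_command_line():   # records 'build_ext' in dist.commands
    dist.run_commands()         # run_command('build_ext') -> ensure_finalized() -> run()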
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
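The flag filtering just quoted works because distutils keeps the compile and link command lines as plain Python lists on the compiler object. Outside of pythran the same idiom looks like this; the flag is only an example, whereas PythranBuildExt reads its list from the ignoreflags entry of the [compiler] section of the pythran configuration (the cfg.cfg.get call above).

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

compiler = new_compiler()
customize_compiler(compiler)            # fills compiler_so / linker_so from sysconfig
for flag in ('-Wstrict-prototypes',):   # example flag only
    for target in ('compiler_so', 'linker_so'):
        try:
            while True:                 # drop every occurrence of the flag
                getattr(compiler, target).remove(flag)
        except (AttributeError, ValueError):
            pass                        # flag not present, or attribute missing on this compiler class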
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpywu3ocf_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpywu3ocf_.cpp'], output_dir = '/tmp/tmpehsx1v4v' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o', ('/tmp/tmpywu3ocf_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
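The job control in CCompiler_compile()/single_compile() above condenses to the pattern below: a thread pool fans the source files out, while a semaphore caps the number of concurrent compiler invocations (the two limits differ when several extensions build in parallel and share the semaphore). The job count and file names are placeholders; numpy takes the real count from get_num_build_jobs().

import threading
from multiprocessing.pool import ThreadPool

jobs = 4                                 # placeholder; numpy uses get_num_build_jobs()
job_semaphore = threading.Semaphore(jobs)

def single_compile(src):
    # at most `jobs` compiler invocations run at once, even if several
    # extensions are being built in parallel and share this semaphore
    with job_semaphore:
        print('compiling', src)          # stands in for self._compile(...)

sources = ['a.cpp', 'b.cpp', 'c.cpp']
with ThreadPool(jobs) as pool:
    pool.map(single_compile, sources)

In this log each test module has a single C++ source, so the quoted code actually takes the serial branch and calls single_compile() in a plain loop.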
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o', '/tmp/tmpywu3ocf_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o', src = '/tmp/tmpywu3ocf_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpywu3ocf_.cpp -o /tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var8', cxxfile = '/tmp/tmpywu3ocf_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgvtuvjmt', buildtmp = '/tmp/tmpehsx1v4v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpywu3ocf_.cpp -o /tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var8(self): > self.run_test("def np_var8(a): from numpy import var ; return var(1j * a, 2)", numpy.array([[[1, 2], [3, 4]]]), np_var8=[NDArray[int,:,:,:]]) pythran/tests/test_numpy_func0.py:337: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var8', cxxfile = '/tmp/tmpywu3ocf_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgvtuvjmt', buildtmp = '/tmp/tmpehsx1v4v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpywu3ocf_.cpp -o /tmp/tmpehsx1v4v/tmp/tmpywu3ocf_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var8' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpehsx1v4v/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpywu3ocf_.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpywu3ocf_.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpywu3ocf_.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var9 ___________________________ [gw2] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var9', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpe7lakywz.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpe7lakywz.cpp'], output_dir = '/tmp/tmpq4tv0bei' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
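The CCompiler_compile docstring above documents the compile() entry point that distutils' build_extension and numpy.distutils both funnel into. As a point of reference, here is a minimal sketch of driving that API directly, against the distutils bundled with Python 3.10 (the module is deprecated and removed in Python 3.12); the source file, macro, include directory and extra flag are placeholders chosen for illustration.

    # Sketch only: compile one C source to an object file via the CCompiler API.
    import os, tempfile
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    workdir = tempfile.mkdtemp()
    src = os.path.join(workdir, "answer.c")
    with open(src, "w") as f:
        f.write("int answer(void) { return 42; }\n")

    cc = new_compiler()        # UnixCCompiler on Linux, MSVCCompiler on Windows, ...
    customize_compiler(cc)     # fold in CC/CFLAGS/LDSHARED from sysconfig

    objects = cc.compile(
        [src],
        output_dir=tempfile.mkdtemp(),
        macros=[("ENABLE_EXAMPLE", None)],   # becomes -DENABLE_EXAMPLE
        include_dirs=[workdir],
        extra_postargs=["-O1"],              # appended last, like extra_compile_args
    )
    print(objects)             # one object file per source, as the docstring promises

Note how extra_postargs lands at the end of the command line; that ordering is why the '-O1 ... -w -UNDEBUG' extra_compile_args visible in kwargs above appear at the tail of the failing gcc invocation.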
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o', ('/tmp/tmpe7lakywz.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o', '/tmp/tmpe7lakywz.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o', src = '/tmp/tmpe7lakywz.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe7lakywz.cpp -o /tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var9', cxxfile = '/tmp/tmpe7lakywz.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgvro4_dz', buildtmp = '/tmp/tmpq4tv0bei' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var9', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
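The setup() documented here is not being run from a real setup.py script in this log: compile_cxxfile in pythran/toolchain.py (quoted above at toolchain.py:300) calls it as an ordinary function with a synthetic script_name/script_args pair, and converts the SystemExit that distutils raises on failure back into a CompileError. A stripped-down sketch of that pattern, using plain distutils and placeholder names rather than pythran's PythranExtension/PythranBuildExt:

    # Sketch only: drive 'build_ext' programmatically and surface failures as CompileError.
    from tempfile import mkdtemp
    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    def build_inline_extension(module_name, c_source):
        """Compile one C source into an extension module; raise CompileError on failure."""
        builddir, buildtmp = mkdtemp(), mkdtemp()
        ext = Extension(module_name, [c_source])
        try:
            setup(name=module_name,
                  ext_modules=[ext],
                  # fake CLI call, as in compile_cxxfile above
                  script_name='setup.py',
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            # distutils reports command failures via SystemExit; re-raise them
            # as a CompileError the caller can catch instead of exiting.
            raise CompileError(str(e))
        return builddir

That conversion is exactly what shows up later in this log: the SystemExit carrying the failed gcc command is caught and re-raised as distutils.errors.CompileError at pythran/toolchain.py:313.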
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe7lakywz.cpp -o /tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var9(self): > self.run_test("def np_var9(a): from numpy import var ; return var(1j * a)", numpy.array([[[1, 2], [3, 4]]]), np_var9=[NDArray[int,:,:,:]]) pythran/tests/test_numpy_func0.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var9', cxxfile = '/tmp/tmpe7lakywz.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgvro4_dz', buildtmp = '/tmp/tmpq4tv0bei' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe7lakywz.cpp -o /tmp/tmpq4tv0bei/tmp/tmpe7lakywz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var9' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpq4tv0bei/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpe7lakywz.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
from /tmp/tmpe7lakywz.cpp:15:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v);
| ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
from /tmp/tmpe7lakywz.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
1975 | conj(_Tp __x)
| ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyFunc1.test_transpose_expr2 ______________________
[gw2] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_transpose_expr2', ...}
klass =
dist =
ok = True
def setup (**attrs):
"""The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpx0kko_72.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpx0kko_72.cpp'], output_dir = '/tmp/tmparo7d7wd' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmparo7d7wd/tmp/tmpx0kko_72.o', ('/tmp/tmpx0kko_72.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmparo7d7wd/tmp/tmpx0kko_72.o', '/tmp/tmpx0kko_72.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmparo7d7wd/tmp/tmpx0kko_72.o', src = '/tmp/tmpx0kko_72.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx0kko_72.cpp -o /tmp/tmparo7d7wd/tmp/tmpx0kko_72.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_transpose_expr2', cxxfile = '/tmp/tmpx0kko_72.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3owpncp5', buildtmp = '/tmp/tmparo7d7wd' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_transpose_expr2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx0kko_72.cpp -o /tmp/tmparo7d7wd/tmp/tmpx0kko_72.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_transpose_expr2(self): > self.run_test("def np_transpose_expr2(a): import numpy as np; return np.conj(a).T", 1j * numpy.ones(6).reshape(2,3), np_transpose_expr2=[NDArray[complex,:,:]]) pythran/tests/test_numpy_func1.py:202: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_transpose_expr2', cxxfile = '/tmp/tmpx0kko_72.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3owpncp5', buildtmp = '/tmp/tmparo7d7wd' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() 
extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx0kko_72.cpp -o /tmp/tmparo7d7wd/tmp/tmpx0kko_72.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_transpose_expr2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmparo7d7wd/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpx0kko_72.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /tmp/tmpx0kko_72.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpx0kko_72.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_std0 ___________________________ [gw6] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
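Note: the captured stderr above is the actual root cause of this failure (and, as the identical diagnostics further down show, of the later ones too): pythonic/include/numpy/conjugate.hpp calls xsimd::conj(v), and the xsimd headers found on the include path (no bundled xsimd directory appears among the -I options, so this is presumably the system copy) do not provide conj for complex batches. The Python-level kernel the test compiles is itself fine; for reference, its pure-Python form, taken from the run_test call in the traceback, runs under plain NumPy:

    # Reference semantics of the kernel behind test_np_transpose_expr2; the
    # failure is in compiling the generated C++, not in this Python code.
    import numpy

    def np_transpose_expr2(a):
        import numpy as np
        return np.conj(a).T

    print(np_transpose_expr2(1j * numpy.ones(6).reshape(2, 3)))  # 3x2 array of -1j entries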
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
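For context on how this distutils machinery is reached at all: the build_ext.run() quoted above is not driven by a real setup.py. compile_cxxfile() in pythran/toolchain.py (quoted earlier in the traceback) issues a fake command line. A condensed sketch of that call follows; the module name, source file and build directories are placeholders standing in for the mkdtemp() results the real code uses:

    # Condensed from compile_cxxfile above; "example_module", "example.cpp" and
    # the /tmp paths are placeholders, not values from this build.
    from numpy.distutils.core import setup      # the wrapper the traceback goes through
    from pythran.dist import PythranExtension, PythranBuildExt

    extension = PythranExtension("example_module", ["example.cpp"])
    setup(name="example_module",
          ext_modules=[extension],
          cmdclass={"build_ext": PythranBuildExt},
          # fake CLI call, as in the traceback
          script_name="setup.py",
          script_args=["--quiet", "build_ext",
                       "--build-lib", "/tmp/builddir",
                       "--build-temp", "/tmp/buildtmp"])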
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
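The ignoreflags loop at the top of the mix-in above removes each configured flag (e.g. the -Wstrict-prototypes flag the preceding comment mentions) from compiler_so/linker_so by calling remove() until it raises. The same effect as a standalone, illustrative helper, not pythran's API:

    # Illustrative only: drop every occurrence of the ignored flags from a
    # compiler argument list in a single pass.
    def strip_flags(argv, ignored):
        return [arg for arg in argv if arg not in ignored]

    print(strip_flags(["gcc", "-Wstrict-prototypes", "-O2", "-Wstrict-prototypes"],
                      {"-Wstrict-prototypes"}))   # ['gcc', '-O2']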
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpk8whh8_x.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpk8whh8_x.cpp'], output_dir = '/tmp/tmp6ktlj124' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
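The CCompiler_compile body quoted next is numpy.distutils' parallel build driver: a module-level semaphore sized by get_num_build_jobs() caps how many compiler processes run at once, a global lock plus a shared set keeps two extensions from building the same object file simultaneously, and a ThreadPool dispatches the per-source jobs. A distilled sketch of that throttling pattern (an illustration, not numpy's code):

    # Bound concurrent work with a semaphore while a thread pool dispatches it.
    import threading
    from multiprocessing.pool import ThreadPool

    jobs = 4                                  # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(item):
        with job_semaphore:                   # at most `jobs` compiles in flight
            print("compiling", item)

    with ThreadPool(jobs) as pool:
        pool.map(single_compile, ["a.cpp", "b.cpp", "c.cpp"])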
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o', ('/tmp/tmpk8whh8_x.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o', '/tmp/tmpk8whh8_x.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o', src = '/tmp/tmpk8whh8_x.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpk8whh8_x.cpp -o /tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_std0', cxxfile = '/tmp/tmpk8whh8_x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvesj66o7', buildtmp = '/tmp/tmp6ktlj124' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpk8whh8_x.cpp -o /tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_std0(self): > self.run_test("def np_std0(a): from numpy import std ; return std(a)", numpy.array([[[1, 2], [3, 4]]]), np_std0=[NDArray[int, :, :, :]]) pythran/tests/test_numpy_func0.py:343: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_std0', cxxfile = '/tmp/tmpk8whh8_x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvesj66o7', buildtmp = '/tmp/tmp6ktlj124' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpk8whh8_x.cpp -o /tmp/tmp6ktlj124/tmp/tmpk8whh8_x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_std0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp6ktlj124/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpk8whh8_x.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5, from /tmp/tmpk8whh8_x.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpk8whh8_x.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_std1 ___________________________ [gw6] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
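The test_std0 failure above has the same root cause as the first one: its kernel never calls conj, but np.std is generated in terms of var, whose header pulls in the conjugate wrapper (std_.hpp -> var.hpp -> conjugate.hpp in the captured stderr), so the missing xsimd::conj surfaces again. The pure-Python form of the kernel, from the run_test call, is unaffected:

    # Reference semantics of the kernel behind test_np_std0.
    import numpy

    def np_std0(a):
        from numpy import std
        return std(a)

    print(np_std0(numpy.array([[[1, 2], [3, 4]]])))  # 1.118033988749895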
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
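The comment above is why the gcc command line recorded further down carries both the distro's -O2/-Wall and the test suite's -O1/-w: the extension's extra_compile_args are appended after everything else, and gcc lets later options override earlier ones. In miniature, with heavily abbreviated flag lists:

    # Illustration of the flag ordering described above; the real lists are much longer.
    compiler_so    = ['gcc', '-O2', '-Wall', '-Werror=format-security']   # distro CFLAGS
    cc_args        = ['-DENABLE_PYTHON_MODULE', '-I/usr/include/python3.10']
    extra_postargs = ['-std=c++11', '-O1', '-w']    # from the test's extra_compile_args
    cmd = compiler_so + cc_args + ['kernel.cpp', '-o', 'kernel.o'] + extra_postargs
    print(' '.join(cmd))   # gcc parses left to right, so -O1 and -w win over -O2/-Wall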
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp4w2n0o5u.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp4w2n0o5u.cpp'], output_dir = '/tmp/tmp5l1n5zfv' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
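The call that ultimately fails is CCompiler.compile(). A hedged, self-contained sketch of the same call shape, using a stub source file plus the macro and include values recorded in this log (the stub path and its contents are invented):

    # Sketch of the compile() call made by build_extension() above, on a stub source.
    import os, tempfile
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    tmp = tempfile.mkdtemp()
    src = os.path.join(tmp, 'kernel.cpp')            # hypothetical translation unit
    with open(src, 'w') as f:
        f.write('int answer() { return 42; }\n')

    cc = new_compiler()
    customize_compiler(cc)
    objects = cc.compile(
        [src],
        output_dir=tmp,
        macros=[('ENABLE_PYTHON_MODULE', None),
                ('__PYTHRAN__', '3'),
                ('PYTHRAN_BLAS_OPENBLAS', None)],     # macro list from this log
        include_dirs=['/usr/include/flexiblas'],      # include dir from this log
        extra_postargs=['-std=c++11'],
    )
    print(objects)                                    # e.g. ['<tmp>/.../kernel.o']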
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o', ('/tmp/tmp4w2n0o5u.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
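numpy.distutils' single_compile above combines a semaphore that caps concurrent compiler processes, a lock-guarded set so the same object file is never built twice, and a ThreadPool that fans the work out. The same pattern in isolation, with generic names rather than numpy's internals:

    # Generic sketch of the parallel-compile pattern used above.
    import threading
    import time
    from multiprocessing.pool import ThreadPool

    jobs = 4                               # stands in for get_num_build_jobs()
    _lock = threading.Lock()
    _sem = threading.Semaphore(jobs)
    _in_progress = set()

    def build_one(item):
        obj, src = item
        while True:                        # wait if another thread builds the same obj
            with _lock:
                if obj not in _in_progress:
                    _in_progress.add(obj)
                    break
            time.sleep(0.1)
        try:
            with _sem:                     # cap concurrent compiler invocations
                print('compiling', src, '->', obj)   # placeholder for self._compile(...)
        finally:
            with _lock:
                _in_progress.discard(obj)

    pool = ThreadPool(jobs)
    pool.map(build_one, [('a.o', 'a.cpp'), ('b.o', 'b.cpp')])
    pool.close()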
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o', '/tmp/tmp4w2n0o5u.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o', src = '/tmp/tmp4w2n0o5u.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4w2n0o5u.cpp -o /tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_std1', cxxfile = '/tmp/tmp4w2n0o5u.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8jxhyvxs', buildtmp = '/tmp/tmp5l1n5zfv' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
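compile_cxxfile drives this setup() machinery with a synthetic command line; an ordinary project would hand the same two pythran.dist hooks seen in these frames to setup() directly. A minimal, hypothetical setup.py (the project and module names are invented):

    # Hypothetical setup.py using the pythran.dist classes from the traceback.
    from distutils.core import setup
    from pythran.dist import PythranExtension, PythranBuildExt

    setup(
        name='example-project',                        # invented name
        ext_modules=[PythranExtension('example_kernel',
                                      sources=['example_kernel.py'])],
        cmdclass={'build_ext': PythranBuildExt},       # same hook as compile_cxxfile
    )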
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
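In the frames that follow, run_commands() lets the CCompilerError escape, distutils converts it into SystemExit("error: ..."), and pythran's toolchain turns that back into a CompileError for its caller. Reduced to its core, the translation looks roughly like this:

    # Reduced sketch of the error translation shown in the next frames.
    from distutils.errors import CompileError

    def run_setup_or_raise(run_setup):
        try:
            run_setup()                    # e.g. the setup(...) call in compile_cxxfile
        except SystemExit as exc:
            # same conversion as pythran/toolchain.py:313 below
            raise CompileError(str(exc))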
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4w2n0o5u.cpp -o /tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_std1(self): > self.run_test("def np_std1(a): from numpy import std ; return std(a, 0)", numpy.array([[[1, 2], [3, 4]]]), np_std1=[NDArray[int, :, :, :]]) pythran/tests/test_numpy_func0.py:346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_std1', cxxfile = '/tmp/tmp4w2n0o5u.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8jxhyvxs', buildtmp = '/tmp/tmp5l1n5zfv' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4w2n0o5u.cpp -o /tmp/tmp5l1n5zfv/tmp/tmp4w2n0o5u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_std1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp5l1n5zfv/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp4w2n0o5u.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5,
                 from /tmp/tmp4w2n0o5u.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp4w2n0o5u.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
___________________________ TestNumpyFunc0.test_std2 ___________________________
[gw6] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
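Both failing tests share the root cause captured in the stderr above: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj, which the xsimd headers visible to this build do not provide, so both std tests fail identically. Before the near-identical test_std2 traceback continues, a hedged reproducer: assuming pythran re-exports compile_pythrancode at the package level, the one-line kernel from test_std1 can be rebuilt outside pytest to obtain the same CompileError.

    # Hedged reproducer for the failure above, outside the test suite.  The
    # package-level import of compile_pythrancode is an assumption; the kernel
    # and module name are the ones used by test_std1 in this log.
    import textwrap
    from distutils.errors import CompileError
    from pythran import compile_pythrancode

    KERNEL = textwrap.dedent('''
        #pythran export np_std1(int[][][])
        def np_std1(a):
            from numpy import std
            return std(a, 0)
        ''')

    try:
        compile_pythrancode('test_np_std1', KERNEL)
    except CompileError as exc:
        # exc carries the failing gcc command line; the xsimd::conj diagnostic
        # itself is printed on stderr, as in the captured output above.
        print('reproduced:', exc)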
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfr23dk8m.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfr23dk8m.cpp'], output_dir = '/tmp/tmprtpqcmkw' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o', ('/tmp/tmpfr23dk8m.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o', '/tmp/tmpfr23dk8m.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o', src = '/tmp/tmpfr23dk8m.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfr23dk8m.cpp -o /tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_std2', cxxfile = '/tmp/tmpfr23dk8m.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpecdorjbt', buildtmp = '/tmp/tmprtpqcmkw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfr23dk8m.cpp -o /tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_std2(self): > self.run_test("def np_std2(a): from numpy import std ; return std(a, 1)", numpy.array([[[1, 2], [3, 4]]]), np_std2=[NDArray[int, :, :, :]]) pythran/tests/test_numpy_func0.py:349: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_std2', cxxfile = '/tmp/tmpfr23dk8m.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpecdorjbt', buildtmp = '/tmp/tmprtpqcmkw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfr23dk8m.cpp -o /tmp/tmprtpqcmkw/tmp/tmpfr23dk8m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_std2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmprtpqcmkw/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpfr23dk8m.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5, from /tmp/tmpfr23dk8m.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpfr23dk8m.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_std3 ___________________________ [gw6] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpi8hho3k1.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpi8hho3k1.cpp'], output_dir = '/tmp/tmp8f1egy0g' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o', ('/tmp/tmpi8hho3k1.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o', '/tmp/tmpi8hho3k1.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o', src = '/tmp/tmpi8hho3k1.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi8hho3k1.cpp -o /tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_std3', cxxfile = '/tmp/tmpi8hho3k1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvll4suon', buildtmp = '/tmp/tmp8f1egy0g' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_std3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi8hho3k1.cpp -o /tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_std3(self): > self.run_test("def np_std3(a): from numpy import std ; return std(1j*a, 1)", numpy.array([[[1, 2], [3, 4]]]), np_std3=[NDArray[int, :, :, :]]) pythran/tests/test_numpy_func0.py:352: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_std3', cxxfile = '/tmp/tmpi8hho3k1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvll4suon', buildtmp = '/tmp/tmp8f1egy0g' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi8hho3k1.cpp -o /tmp/tmp8f1egy0g/tmp/tmpi8hho3k1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_std3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp8f1egy0g/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpi8hho3k1.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5, from /tmp/tmpi8hho3k1.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpi8hho3k1.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var0 ___________________________ [gw6] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpcjssj1rr.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpcjssj1rr.cpp'], output_dir = '/tmp/tmpzvrb8qsg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o', ('/tmp/tmpcjssj1rr.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
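
Note: the numpy.distutils frames above (CCompiler_compile / single_compile) show how build parallelism is bounded here: a ThreadPool walks the (object, source) pairs while a module-level semaphore caps how many compiler processes actually run at once, and a lock plus a shared set keeps two threads from building the same object file. The sketch below is a simplified, standard-library-only illustration of that pattern; fake_compile, SOURCES and jobs=4 are made-up names for the example, and unlike numpy's version it skips an in-progress object instead of waiting for it.

import threading
from multiprocessing.pool import ThreadPool

jobs = 4                                    # numpy derives this from get_num_build_jobs()
_job_semaphore = threading.Semaphore(jobs)  # caps concurrent compiler invocations
_global_lock = threading.Lock()             # guards the shared bookkeeping set
_processing_files = set()

SOURCES = [("a.o", "a.cpp"), ("b.o", "b.cpp"), ("c.o", "c.cpp")]   # made-up work items

def fake_compile(obj, src):
    # stand-in for self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
    print("compiling %s -> %s" % (src, obj))

def single_compile(item):
    obj, src = item
    with _global_lock:                      # no atomic check-and-add under the GIL, hence the lock
        if obj in _processing_files:        # numpy waits here instead of skipping
            return
        _processing_files.add(obj)
    try:
        with _job_semaphore:                # at most `jobs` compiles in flight
            fake_compile(obj, src)
    finally:
        with _global_lock:
            _processing_files.remove(obj)

if __name__ == "__main__":
    with ThreadPool(jobs) as pool:          # threads drive the per-file compiles
        pool.map(single_compile, SOURCES)
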
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o', '/tmp/tmpcjssj1rr.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o', src = '/tmp/tmpcjssj1rr.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcjssj1rr.cpp -o /tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var0', cxxfile = '/tmp/tmpcjssj1rr.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppkjqh6fq', buildtmp = '/tmp/tmpzvrb8qsg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcjssj1rr.cpp -o /tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var0(self): > self.run_test("def np_var0(a): return a.var()", numpy.array([[1, 2], [3, 4]], dtype=float), np_var0=[NDArray[float,:,:]]) pythran/tests/test_numpy_func0.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var0', cxxfile = '/tmp/tmpcjssj1rr.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppkjqh6fq', buildtmp = '/tmp/tmpzvrb8qsg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) 
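
Note: the distutils.core.setup() frame above converts the CompileError into SystemExit("error: " + str(msg)), and the compile_cxxfile frame just below converts that SystemExit back into a distutils CompileError, which is why the same gcc command line is printed once per layer for a single failed compile. A minimal sketch of that round trip, using only the exception types visible in the traceback (the command string is shortened here):

from distutils.errors import CompileError

def distutils_setup():
    # distutils.core.setup(): DistutilsError/CCompilerError -> SystemExit("error: ...")
    try:
        raise CompileError('Command "gcc ..." failed with exit status 1')
    except CompileError as msg:
        raise SystemExit("error: " + str(msg))

def compile_cxxfile_like():
    # pythran/toolchain.py compile_cxxfile(): SystemExit -> CompileError
    try:
        distutils_setup()
    except SystemExit as e:
        raise CompileError(str(e))

try:
    compile_cxxfile_like()
except CompileError as err:
    print(err)    # error: Command "gcc ..." failed with exit status 1
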
try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcjssj1rr.cpp -o /tmp/tmpzvrb8qsg/tmp/tmpcjssj1rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 
-fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpzvrb8qsg/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpcjssj1rr.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpcjssj1rr.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpcjssj1rr.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
___________________________ TestNumpyFunc0.test_var1 ___________________________
[gw6] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var1', ...}
klass =
dist =
ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs
    to do, in a highly flexible and user-driven way. Briefly: create a
    Distribution instance; find and parse config files; parse the command
    line; run each Distutils command found there, customized by the options
    supplied to 'setup()' (as keyword arguments), in config files, and on
    the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpzq_3mapo.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpzq_3mapo.cpp'], output_dir = '/tmp/tmplsl63kva' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmplsl63kva/tmp/tmpzq_3mapo.o', ('/tmp/tmpzq_3mapo.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
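
Note: test_var1 is failing at the same compile step as test_var0, with the identical gcc invocation. The captured stderr shown earlier for test_var0 points at pythran/pythonic/include/numpy/conjugate.hpp:25, where the generated C++ calls xsimd::conj() and the xsimd headers used in this build provide no such function ("‘conj’ is not a member of ‘xsimd’"), so every generated module that reaches the conjugate wrapper (here via numpy.var) fails to compile. The failure can be reproduced outside pytest by driving the toolchain entry point named in the traceback, pythran/toolchain.py compile_pythrancode. The sketch below is a hedged reproduction, assuming pythran 0.11.0 is importable; the #pythran export spec line is written by hand here, whereas the test harness derives it from NDArray[float,:,:].

from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

code = """
#pythran export np_var0(float[:,:])
def np_var0(a):
    return a.var()
"""

try:
    compile_pythrancode("test_np_var0", code)
except CompileError as err:
    # On this builder the message carries the full gcc command that failed with
    # exit status 1, after the "'conj' is not a member of 'xsimd'" diagnostic.
    print(err)
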
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmplsl63kva/tmp/tmpzq_3mapo.o', '/tmp/tmpzq_3mapo.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmplsl63kva/tmp/tmpzq_3mapo.o', src = '/tmp/tmpzq_3mapo.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzq_3mapo.cpp -o /tmp/tmplsl63kva/tmp/tmpzq_3mapo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var1', cxxfile = '/tmp/tmpzq_3mapo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphr08tcbp', buildtmp = '/tmp/tmplsl63kva' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzq_3mapo.cpp -o /tmp/tmplsl63kva/tmp/tmpzq_3mapo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var1(self): > self.run_test("def np_var1(a): from numpy import var ; return var(a, 1)", numpy.array([[1, 2], [3, 4.]]), np_var1=[NDArray[float,:,:]]) pythran/tests/test_numpy_func0.py:316: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var1', cxxfile = '/tmp/tmpzq_3mapo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphr08tcbp', buildtmp = '/tmp/tmplsl63kva' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzq_3mapo.cpp -o /tmp/tmplsl63kva/tmp/tmpzq_3mapo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmplsl63kva/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpzq_3mapo.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpzq_3mapo.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpzq_3mapo.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________________ TestNumpyFunc0.test_var2 ___________________________ [gw6] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp50imfcop.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp50imfcop.cpp'], output_dir = '/tmp/tmpq8fnsdn9' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o', ('/tmp/tmp50imfcop.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o', '/tmp/tmp50imfcop.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o', src = '/tmp/tmp50imfcop.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50imfcop.cpp -o /tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_var2', cxxfile = '/tmp/tmp50imfcop.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvbcf9v30', buildtmp = '/tmp/tmpq8fnsdn9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_var2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50imfcop.cpp -o /tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_var2(self): > self.run_test("def np_var2(a): from numpy import var ; return var(a)", numpy.array([[[1, 2], [3, 4.]]]), np_var2=[NDArray[float,:,:,:]]) pythran/tests/test_numpy_func0.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_var2', cxxfile = '/tmp/tmp50imfcop.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvbcf9v30', buildtmp = '/tmp/tmpq8fnsdn9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, 
[cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50imfcop.cpp -o /tmp/tmpq8fnsdn9/tmp/tmp50imfcop.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_var2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpq8fnsdn9/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp50imfcop.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp50imfcop.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp50imfcop.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyFunc2.test_convolve_1 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpsslfosmk.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpsslfosmk.cpp'], output_dir = '/tmp/tmpyjtqyz2i' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
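The CCompiler_compile docstring above mirrors the stock distutils CCompiler.compile() signature, which is exactly how build_ext drives the compiler a few frames up. A minimal, standalone way to exercise that same API (file names, directories and flags here are placeholders, not values from this build):

    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    compiler = new_compiler()           # picks the platform default (unix, msvc, ...)
    customize_compiler(compiler)        # apply CC, CFLAGS, LDSHARED, ... from sysconfig

    objects = compiler.compile(
        ['example.c'],                  # placeholder source file
        output_dir='build/tmp',
        macros=[('NDEBUG', None)],      # (name, value) defines; a (name,) 1-tuple undefines
        include_dirs=['include'],
        extra_postargs=['-O2'],
    )
    compiler.link_shared_object(objects, 'example.so', output_dir='build/lib')
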
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o', ('/tmp/tmpsslfosmk.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
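numpy.distutils guards its parallel builds in two ways in the code above: a semaphore sized to get_num_build_jobs() caps concurrent compiler invocations, and a lock-protected set keeps two extensions from racing on the same object file. The same pattern in miniature (simplified: this sketch skips an object that is already in flight instead of waiting for it, and the build step is a stub):

    import threading
    from concurrent.futures import ThreadPoolExecutor

    JOBS = 4                                    # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(JOBS)
    state_lock = threading.Lock()
    in_progress = set()

    def compile_one(obj):
        with state_lock:
            if obj in in_progress:              # someone else is already building this object
                return
            in_progress.add(obj)
        try:
            with job_semaphore:                 # never more than JOBS compiles at once
                pass                            # a real build would spawn the compiler here
        finally:
            with state_lock:
                in_progress.discard(obj)

    with ThreadPoolExecutor(max_workers=JOBS) as pool:
        list(pool.map(compile_one, ['a.o', 'b.o', 'c.o']))
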
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o', '/tmp/tmpsslfosmk.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o', src = '/tmp/tmpsslfosmk.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsslfosmk.cpp -o /tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_1', cxxfile = '/tmp/tmpsslfosmk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpkiqkf9jz', buildtmp = '/tmp/tmpyjtqyz2i' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
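This is the code path pythran's compile_cxxfile relies on when it calls setup() with script_name and script_args as a fake CLI invocation (see the frames above): distutils turns any build failure into SystemExit, and the caller converts it back into a CompileError. A stripped-down version of that call, with the extension object left abstract:

    from distutils.core import setup
    from distutils.errors import CompileError

    def build_in_process(extension, build_lib, build_temp):
        """Drive 'setup.py build_ext' programmatically, as pythran's toolchain does."""
        try:
            setup(
                name=extension.name,
                ext_modules=[extension],
                script_name='setup.py',                 # fake CLI call
                script_args=['--quiet', 'build_ext',
                             '--build-lib', build_lib,
                             '--build-temp', build_temp],
            )
        except SystemExit as e:                         # distutils reports errors this way
            raise CompileError(str(e))

The real compile_cxxfile additionally passes cmdclass={"build_ext": PythranBuildExt}, so the C++-specific compiler tweaks quoted earlier in this traceback are applied to the build.
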
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsslfosmk.cpp -o /tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_1(self): > self.run_test("def np_convolve_1(a,b):\n from numpy import convolve\n return convolve(a,b)", numpy.arange(10,dtype=float), numpy.arange(12,dtype=float), np_convolve_1=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_1', cxxfile = '/tmp/tmpsslfosmk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpkiqkf9jz', buildtmp = '/tmp/tmpyjtqyz2i' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = 
mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsslfosmk.cpp -o /tmp/tmpyjtqyz2i/tmp/tmpsslfosmk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security 
-Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpyjtqyz2i/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpsslfosmk.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmpsslfosmk.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                   ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpsslfosmk.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_convolve_10 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_10', ...}
klass =
dist =
ok = True
    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpc9rrndpg.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpc9rrndpg.cpp'], output_dir = '/tmp/tmpz8bz7yzx' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
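For this second failure the same merge of define_macros and undef_macros happens again: 2-tuples become -D options and 1-tuples become -U options before include directories are appended as -I options. distutils exposes that translation directly, which makes the convention easy to check in isolation (the macro and include values below are taken from the compile options echoed in this log, except -UNDEBUG, which is used here only to show the 1-tuple form):

    from distutils.ccompiler import gen_preprocess_options

    macros = [('ENABLE_PYTHON_MODULE', None),   # -> -DENABLE_PYTHON_MODULE
              ('__PYTHRAN__', '3'),             # -> -D__PYTHRAN__=3
              ('NDEBUG',)]                      # 1-tuple means undefine: -> -UNDEBUG
    include_dirs = ['/usr/include/flexiblas']

    print(gen_preprocess_options(macros, include_dirs))
    # ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-UNDEBUG', '-I/usr/include/flexiblas']
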
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o', ('/tmp/tmpc9rrndpg.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o', '/tmp/tmpc9rrndpg.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o', src = '/tmp/tmpc9rrndpg.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc9rrndpg.cpp -o /tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_10', cxxfile = '/tmp/tmpc9rrndpg.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjpnbaou_', buildtmp = '/tmp/tmpz8bz7yzx' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_10', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc9rrndpg.cpp -o /tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_10(self): > self.run_test("def np_convolve_10(a,b):\n from numpy import convolve\n return convolve(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(7,dtype=numpy.float32), np_convolve_10=[NDArray[float,:],NDArray[numpy.float32,:]]) pythran/tests/test_numpy_func2.py:252: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_10', cxxfile = '/tmp/tmpc9rrndpg.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjpnbaou_', buildtmp = '/tmp/tmpz8bz7yzx' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError 
on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc9rrndpg.cpp -o /tmp/tmpz8bz7yzx/tmp/tmpc9rrndpg.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_10' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpz8bz7yzx/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpc9rrndpg.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5, from /tmp/tmpc9rrndpg.cpp:16: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpc9rrndpg.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_convolve_11 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_11', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
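The captured stderr above shows the actual root cause that every convolve/correlate test trips over: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, which the xsimd headers used for this build do not provide, so the generated module fails to compile long before the test logic runs. A hedged way to reproduce the failing compile outside the unittest harness is sketched below, assuming compile_pythrancode accepts (module_name, code) and parses the '#pythran export' line itself; the module and function names are placeholders.

from pythran.toolchain import compile_pythrancode

CODE = """
#pythran export np_convolve(float[], float32[])
def np_convolve(a, b):
    from numpy import convolve
    return convolve(a, b, 'same')
"""

try:
    # On an affected build this raises CompileError with the same xsimd::conj
    # diagnostic as in the log; on a fixed build it returns the .so path.
    native = compile_pythrancode("xsimd_conj_repro", CODE)
    print("built", native)
except Exception as exc:
    print("compilation failed:", exc)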
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
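pythran's build_extension, quoted below, saves a handful of compiler attributes into a prev dict, overrides them with ext.cxx when one is set, and restores them afterwards. The same backup/override/restore idea in a compact, generic form is sketched here; override_compiler is illustrative only, not pythran API.

from contextlib import contextmanager

@contextmanager
def override_compiler(compiler, attr, value):
    """Temporarily replace one distutils compiler attribute, then restore it."""
    original = getattr(compiler, attr, None)
    if original is None:               # attribute absent on this compiler class
        yield
        return
    if isinstance(original, list):     # e.g. compiler_so == ['gcc', '-O2', ...]
        saved, original[0] = original[0], value
        try:
            yield
        finally:
            original[0] = saved
    else:
        setattr(compiler, attr, value)
        try:
            yield
        finally:
            setattr(compiler, attr, original)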
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9rakcx7j.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9rakcx7j.cpp'], output_dir = '/tmp/tmpqs9au5pe' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o', ('/tmp/tmp9rakcx7j.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
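CCompiler_compile above parallelizes per object file, capped by get_num_build_jobs(), so on a busy builder the C++ diagnostics from several failing objects can interleave. When chasing a single error like this one it can help to force a serial build; assuming, as in current numpy.distutils, that the job count honours the NPY_NUM_BUILD_JOBS environment variable:

import os
os.environ["NPY_NUM_BUILD_JOBS"] = "1"   # set before any compilation starts

from numpy.distutils.misc_util import get_num_build_jobs
print(get_num_build_jobs())              # expected: 1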
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o', '/tmp/tmp9rakcx7j.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o', src = '/tmp/tmp9rakcx7j.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9rakcx7j.cpp -o /tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_11', cxxfile = '/tmp/tmp9rakcx7j.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa7at8g74', buildtmp = '/tmp/tmpqs9au5pe' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_11', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
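As the next frame shows, the gateway converts DistutilsError/CCompilerError into SystemExit("error: ..."), which is why compile_cxxfile wraps its setup() call and re-raises a CompileError so callers get a catchable exception instead of an interpreter exit. A minimal sketch of that translation; run_setup_safely is an illustrative name.

from distutils.errors import CompileError

def run_setup_safely(run_setup):
    """Run an in-process setup() invocation and surface failures as CompileError."""
    try:
        return run_setup()
    except SystemExit as exc:    # distutils wraps build errors in SystemExit
        raise CompileError(str(exc)) from exc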
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9rakcx7j.cpp -o /tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_11(self): > self.run_test("def np_convolve_11(a,b):\n from numpy import convolve\n return convolve(a,b,'same')", numpy.arange(12,dtype=numpy.float32), numpy.arange(7,dtype=float), np_convolve_11=[NDArray[numpy.float32,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:258: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_11', cxxfile = '/tmp/tmp9rakcx7j.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa7at8g74', buildtmp = '/tmp/tmpqs9au5pe' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError 
on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9rakcx7j.cpp -o /tmp/tmpqs9au5pe/tmp/tmp9rakcx7j.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_11' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpqs9au5pe/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp9rakcx7j.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5, from /tmp/tmp9rakcx7j.cpp:16: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp9rakcx7j.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyFunc2.test_convolve_2 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmptljh3ltz.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmptljh3ltz.cpp'], output_dir = '/tmp/tmpmhcv438j' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpmhcv438j/tmp/tmptljh3ltz.o', ('/tmp/tmptljh3ltz.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
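Editorial note: the single_compile closure quoted above gates every compile behind a module-level semaphore and, when more than one translation unit is queued, maps the work over a ThreadPool. A stripped-down sketch of that throttling pattern (not numpy's code; the job count of 4 and the file names are arbitrary):

import multiprocessing.pool
import threading

_job_semaphore = threading.Semaphore(4)   # cap concurrent compiles at 4

def compile_one(source):
    with _job_semaphore:                  # at most 4 compiles in flight
        print("compiling", source)        # real code would spawn the compiler here

sources = ["a.cpp", "b.cpp", "c.cpp"]     # placeholder translation units
pool = multiprocessing.pool.ThreadPool(4)
pool.map(compile_one, sources)
pool.close()
pool.join()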
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpmhcv438j/tmp/tmptljh3ltz.o', '/tmp/tmptljh3ltz.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpmhcv438j/tmp/tmptljh3ltz.o', src = '/tmp/tmptljh3ltz.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptljh3ltz.cpp -o /tmp/tmpmhcv438j/tmp/tmptljh3ltz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_2', cxxfile = '/tmp/tmptljh3ltz.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqunxke3h', buildtmp = '/tmp/tmpmhcv438j' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
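Editorial note: compile_cxxfile, quoted above, drives a PythranExtension through a fake setup() call with PythranBuildExt as the build_ext command. Outside the test suite the same pair backs pythran's regular setup.py integration; a short sketch, assuming a placeholder demo.py that carries a '#pythran export' line:

# Sketch of pythran's distutils integration, not taken from this log.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

setup(
    name="demo",
    ext_modules=[PythranExtension("demo", ["demo.py"])],
    cmdclass={"build_ext": PythranBuildExt},
)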
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
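Editorial note: the except branch quoted just below is where distutils turns a DistutilsError/CCompilerError into SystemExit("error: ..."); pythran's compile_cxxfile (seen further down) then converts that SystemExit back into a CompileError, which is the chained-exception pattern visible throughout this log. A toy sketch of that round-trip, with fake_setup standing in for distutils.core.setup():

from distutils.errors import CompileError

def fake_setup():
    # stand-in for setup() when the compiler fails
    raise SystemExit('error: Command "gcc ..." failed with exit status 1')

try:
    fake_setup()
except SystemExit as exc:
    err = CompileError(str(exc))      # what compile_cxxfile re-raises
    print(type(err).__name__, "->", err)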
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptljh3ltz.cpp -o /tmp/tmpmhcv438j/tmp/tmptljh3ltz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_2(self): > self.run_test("def np_convolve_2(a,b):\n from numpy import convolve\n return convolve(a,b)", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_convolve_2=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:201: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_2', cxxfile = '/tmp/tmptljh3ltz.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqunxke3h', buildtmp = '/tmp/tmpmhcv438j' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = 
mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptljh3ltz.cpp -o /tmp/tmpmhcv438j/tmp/tmptljh3ltz.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security 
-Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpmhcv438j/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmptljh3ltz.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmptljh3ltz.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmptljh3ltz.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_3 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_3', ...}
klass =
dist =
ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
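Editorial note: the ‘conj’ is not a member of ‘xsimd’ diagnostic reported above is the actual failure behind the CompileError for test_convolve_2: the C++ generated for numpy.convolve pulls in conjugate.hpp, which calls xsimd::conj, and the xsimd headers used on this builder do not declare it. A hedged, stand-alone reproducer outside the unittest harness; the function body and spec mirror the failing test, and compile_pythrancode's exact keyword arguments may differ between pythran versions:

import pythran

code = """
#pythran export np_convolve_2(float[], float[])
def np_convolve_2(a, b):
    from numpy import convolve
    return convolve(a, b)
"""

# On this builder the call raises distutils.errors.CompileError because the
# generated C++ references xsimd::conj, which the xsimd headers in use lack.
pythran.compile_pythrancode("test_np_convolve_2", code)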
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp3b76pm26.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp3b76pm26.cpp'], output_dir = '/tmp/tmph46u5ouy' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmph46u5ouy/tmp/tmp3b76pm26.o', ('/tmp/tmp3b76pm26.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmph46u5ouy/tmp/tmp3b76pm26.o', '/tmp/tmp3b76pm26.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmph46u5ouy/tmp/tmp3b76pm26.o', src = '/tmp/tmp3b76pm26.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3b76pm26.cpp -o /tmp/tmph46u5ouy/tmp/tmp3b76pm26.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_3', cxxfile = '/tmp/tmp3b76pm26.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpg1omdmng', buildtmp = '/tmp/tmph46u5ouy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3b76pm26.cpp -o /tmp/tmph46u5ouy/tmp/tmp3b76pm26.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_3(self): > self.run_test("def np_convolve_3(a,b):\n from numpy import convolve\n return convolve(a,b,'valid')", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_convolve_3=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_3', cxxfile = '/tmp/tmp3b76pm26.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpg1omdmng', buildtmp = '/tmp/tmph46u5ouy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
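# compile_cxxfile drives distutils entirely in-process: it wraps the generated
# C++ file in a PythranExtension, then hands setup() a synthetic command line
# (script_name/script_args) so that build_ext compiles into two throwaway
# mkdtemp() directories. Any SystemExit escaping setup() is re-raised as
# CompileError, which is the exception reported for the failing convolve tests
# in this log.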
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3b76pm26.cpp -o /tmp/tmph46u5ouy/tmp/tmp3b76pm26.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmph46u5ouy/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp3b76pm26.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmp3b76pm26.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                   ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp3b76pm26.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_4 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_4', ...}
klass =
dist =
ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpeensuloy.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpeensuloy.cpp'], output_dir = '/tmp/tmp9auk95gg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
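# CCompiler_compile is numpy.distutils' replacement for distutils' compile():
# it caps concurrent compilations with a semaphore sized by
# get_num_build_jobs(), logs the "C compiler:" / "compile options:" /
# "extra options:" lines seen in the captured stdout above, then routes each
# source file through single_compile(), which finally calls self._compile().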
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp9auk95gg/tmp/tmpeensuloy.o', ('/tmp/tmpeensuloy.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp9auk95gg/tmp/tmpeensuloy.o', '/tmp/tmpeensuloy.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp9auk95gg/tmp/tmpeensuloy.o', src = '/tmp/tmpeensuloy.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeensuloy.cpp -o /tmp/tmp9auk95gg/tmp/tmpeensuloy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_4', cxxfile = '/tmp/tmpeensuloy.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbpjejhk4', buildtmp = '/tmp/tmp9auk95gg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeensuloy.cpp -o /tmp/tmp9auk95gg/tmp/tmpeensuloy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_4(self): > self.run_test("def np_convolve_4(a,b):\n from numpy import convolve\n return convolve(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_convolve_4=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:213: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_4', cxxfile = '/tmp/tmpeensuloy.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbpjejhk4', buildtmp = '/tmp/tmp9auk95gg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeensuloy.cpp -o /tmp/tmp9auk95gg/tmp/tmpeensuloy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_4' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp9auk95gg/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpeensuloy.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmpeensuloy.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                   ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpeensuloy.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_5 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_5', ...}
klass =
dist =
ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpkcbz8hvv.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpkcbz8hvv.cpp'], output_dir = '/tmp/tmp_o9ly8eh' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o', ('/tmp/tmpkcbz8hvv.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
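The CCompiler_compile source reproduced above throttles parallel builds with a module-level semaphore and fans per-source jobs out over a ThreadPool when more than one object needs building. A stripped-down sketch of that pattern, with placeholder file names and a print() standing in for the actual compiler invocation:

    import threading
    from multiprocessing.pool import ThreadPool

    jobs = 4                                   # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(src):
        with job_semaphore:                    # at most `jobs` compiles at once
            print("compiling", src)            # the real code spawns the compiler here

    sources = ['a.cpp', 'b.cpp', 'c.cpp']      # hypothetical inputs
    with ThreadPool(jobs) as pool:
        pool.map(single_compile, sources)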
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o', '/tmp/tmpkcbz8hvv.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o', src = '/tmp/tmpkcbz8hvv.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkcbz8hvv.cpp -o /tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_5', cxxfile = '/tmp/tmpkcbz8hvv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp7ntzj0u0', buildtmp = '/tmp/tmp_o9ly8eh' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
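The setup() source above proceeds in stages (build the Distribution, parse config files, parse the command line, run the commands), with _setup_stop_after acting as a checkpoint between them. distutils exposes the same checkpoints through distutils.core.run_setup; a small sketch, assuming a setup.py exists in the current directory:

    from distutils.core import run_setup

    # Stop after command-line parsing: the Distribution is fully configured
    # but no command (and hence no compiler) is actually run.
    dist = run_setup('setup.py',
                     script_args=['--quiet', 'build_ext'],
                     stop_after='commandline')
    print(dist.commands)        # e.g. ['build_ext']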
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkcbz8hvv.cpp -o /tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_5(self): > self.run_test("def np_convolve_5(a,b):\n from numpy import convolve\n return convolve(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(7,dtype=float), np_convolve_5=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_5', cxxfile = '/tmp/tmpkcbz8hvv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp7ntzj0u0', buildtmp = '/tmp/tmp_o9ly8eh' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
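compile_cxxfile, whose docstring closes the frame above and whose body continues below, drives an in-process fake setup.py run and converts the SystemExit that distutils raises on failure back into a CompileError. A reduced sketch of that wrapper; the names and paths here are placeholders, not the ones from this log:

    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    def build_inprocess(module_name, cxxfile, build_lib, build_temp):
        ext = Extension(module_name, [cxxfile])
        try:
            setup(name=module_name,
                  ext_modules=[ext],
                  script_name='setup.py',            # fake CLI call
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', build_lib,
                               '--build-temp', build_temp])
        except SystemExit as e:
            # distutils turns build failures into SystemExit; undo that so the
            # caller can treat them as compilation errors.
            raise CompileError(str(e))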
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkcbz8hvv.cpp -o /tmp/tmp_o9ly8eh/tmp/tmpkcbz8hvv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_5' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp_o9ly8eh/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpkcbz8hvv.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmpkcbz8hvv.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 | return xsimd::conj(v);
      |               ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpkcbz8hvv.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 | conj(_Tp __x)
      | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_6 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_6', ...}
klass =
dist =
ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpykqrqzws.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpykqrqzws.cpp'], output_dir = '/tmp/tmpe2g_akb1' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o', ('/tmp/tmpykqrqzws.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o', '/tmp/tmpykqrqzws.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o', src = '/tmp/tmpykqrqzws.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpykqrqzws.cpp -o /tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_6', cxxfile = '/tmp/tmpykqrqzws.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfj0ig_b_', buildtmp = '/tmp/tmpe2g_akb1' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpykqrqzws.cpp -o /tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_6(self): > self.run_test("def np_convolve_6(a,b):\n from numpy import convolve\n return convolve(a,b,'full')", numpy.arange(12.) + 1j*numpy.arange(12.), numpy.arange(7.) 
+ 1j* numpy.arange(7.), np_convolve_6=[NDArray[complex,:],NDArray[complex,:]]) pythran/tests/test_numpy_func2.py:225: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_6', cxxfile = '/tmp/tmpykqrqzws.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfj0ig_b_', buildtmp = '/tmp/tmpe2g_akb1' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpykqrqzws.cpp -o /tmp/tmpe2g_akb1/tmp/tmpykqrqzws.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 
'test_np_convolve_6' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpe2g_akb1/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpykqrqzws.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5, from /tmp/tmpykqrqzws.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpykqrqzws.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_7 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie.
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
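Aside: the include_dirs/define handling a few lines below is what turns the extension's settings into the -I and -D options visible in the gcc command lines above. Stated on its own, a plain distutils Extension carrying the same macros and include directory seen in this log would look roughly like the following sketch; the source file name is a placeholder and PythranExtension normally fills these fields in automatically:

    # Sketch only: an Extension with the -D/-I settings observed above.
    from distutils.core import Extension

    ext = Extension(
        'test_np_convolve_6',
        ['example.cpp'],                       # placeholder source file
        define_macros=[('ENABLE_PYTHON_MODULE', None),
                       ('__PYTHRAN__', '3'),
                       ('PYTHRAN_BLAS_OPENBLAS', None)],
        include_dirs=['/usr/include/flexiblas'],
        extra_compile_args=['-std=c++11', '-fno-math-errno'],
    )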
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpx2vkj2m7.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpx2vkj2m7.cpp'], output_dir = '/tmp/tmptwxocyhc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o', ('/tmp/tmpx2vkj2m7.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
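Aside: the single_compile()/_job_semaphore machinery quoted above amounts to a bounded thread pool: at most get_num_build_jobs() compiler invocations run at once, even when several extensions build in parallel. A rough standalone sketch of the same pattern, where compile_one is a placeholder callable:

    # Rough sketch of the bounded parallel-compile pattern above.
    import threading
    import multiprocessing.pool

    def compile_all(compile_one, items, jobs=4):
        sem = threading.Semaphore(jobs)        # at most `jobs` concurrent compiles
        def worker(item):
            with sem:
                compile_one(item)
        if len(items) > 1 and jobs > 1:
            pool = multiprocessing.pool.ThreadPool(jobs)
            try:
                pool.map(worker, items)
            finally:
                pool.close()
                pool.join()
        else:
            for item in items:
                worker(item)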
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o', '/tmp/tmpx2vkj2m7.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o', src = '/tmp/tmpx2vkj2m7.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2vkj2m7.cpp -o /tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_7', cxxfile = '/tmp/tmpx2vkj2m7.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp81e3a8ex', buildtmp = '/tmp/tmptwxocyhc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2vkj2m7.cpp -o /tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_7(self): dtype = numpy.float32 > self.run_test("def np_convolve_7(a,b):\n from numpy import convolve\n return convolve(a,b,'full')", numpy.arange(12).astype(dtype) + 1j*numpy.arange(12).astype(dtype), numpy.arange(7).astype(dtype) + 1j*numpy.arange(7).astype(dtype), np_convolve_7=[NDArray[numpy.complex64,:],NDArray[numpy.complex64,:]]) pythran/tests/test_numpy_func2.py:232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_7', cxxfile = '/tmp/tmpx2vkj2m7.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp81e3a8ex', buildtmp = '/tmp/tmptwxocyhc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): 
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2vkj2m7.cpp -o /tmp/tmptwxocyhc/tmp/tmpx2vkj2m7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_7' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmptwxocyhc/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpx2vkj2m7.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5, from /tmp/tmpx2vkj2m7.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpx2vkj2m7.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyFunc2.test_convolve_8 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
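Aside: the loop that follows strips every flag listed in the 'ignoreflags' configuration entry from the compiler and linker argument lists (the classic example being -Wstrict-prototypes, which is meaningless for C++). Reduced to its core, the idea is the following sketch; the default flag list here is only an illustration:

    # Sketch of the flag stripping performed below by PythranBuildExt.
    def strip_flags(compiler, ignoreflags=('-Wstrict-prototypes',)):
        for flag in ignoreflags:
            for target in ('compiler_so', 'linker_so'):
                args = getattr(compiler, target, None)
                if args is None:
                    continue
                while flag in args:            # remove repeated occurrences too
                    args.remove(flag)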
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpi_nalyim.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpi_nalyim.cpp'], output_dir = '/tmp/tmpuo6leeh0' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
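The docstring above lists the knobs that CCompiler.compile() accepts (sources, output_dir, macros, include_dirs, extra_postargs, depends), which pythran fills in from its own configuration, as the captured locals show. For orientation, a minimal standalone use of the same distutils API, with placeholder file names rather than anything from this build:

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

cc = new_compiler()                  # platform-appropriate CCompiler subclass
customize_compiler(cc)               # pull CC, CFLAGS, LDSHARED... from sysconfig
objects = cc.compile(['module.c'],
                     output_dir='build-temp',
                     macros=[('NDEBUG', None)],        # becomes -DNDEBUG
                     include_dirs=['include'],
                     extra_postargs=['-O2'])
cc.link_shared_object(objects, 'build-lib/module.so')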
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o', ('/tmp/tmpi_nalyim.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
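The single_compile closure above is numpy.distutils' guard for parallel extension builds: a lock-protected set keeps two threads from compiling the same object file (which happens when one source is shared by several extensions), and a module-level semaphore caps the number of concurrent compiler processes. A stripped-down sketch of that pattern, independent of distutils; compile_one and the job count are placeholders:

import threading
import time

_lock = threading.Lock()
_in_progress = set()                 # object files currently being built
_jobs = threading.Semaphore(4)       # at most 4 compiler processes at once

def guarded_compile(obj, compile_one):
    # Wait until no other thread is building the same object file.
    while True:
        with _lock:
            if obj not in _in_progress:
                _in_progress.add(obj)
                break
        time.sleep(0.1)
    try:
        with _jobs:                  # take a build slot
            compile_one(obj)
    finally:
        with _lock:
            _in_progress.discard(obj)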
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o', '/tmp/tmpi_nalyim.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o', src = '/tmp/tmpi_nalyim.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi_nalyim.cpp -o /tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_8', cxxfile = '/tmp/tmpi_nalyim.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp7jhlg0wq', buildtmp = '/tmp/tmpuo6leeh0' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
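This setup() is the entry point that pythran's compile_cxxfile drives programmatically: instead of a real command line it passes script_name and script_args, so parse_command_line() and run_commands() behave as if "python setup.py build_ext ..." had been typed. A sketch of that trick, with illustrative names and paths rather than values from this build:

from distutils.core import setup, Extension

# Build one extension in-process, without a setup.py on disk.
setup(name='scratch_module',
      ext_modules=[Extension('scratch_module', ['scratch_module.c'])],
      script_name='setup.py',
      script_args=['--quiet', 'build_ext',
                   '--build-lib', '/tmp/build-lib',
                   '--build-temp', '/tmp/build-temp'])

In the tail of setup() that follows, a CompileError raised by the compiler backend is what resurfaces as the SystemExit("error: ...") seen below.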
    if ok:
        try:
            dist.run_commands()
        except KeyboardInterrupt:
            raise SystemExit("interrupted")
        except OSError as exc:
            if DEBUG:
                sys.stderr.write("error: %s\n" % (exc,))
                raise
            else:
                raise SystemExit("error: %s" % (exc,))
        except (DistutilsError, CCompilerError) as msg:
            if DEBUG:
                raise
            else:
>               raise SystemExit("error: " + str(msg))
E               SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi_nalyim.cpp -o /tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1

/usr/lib64/python3.10/distutils/core.py:163: SystemExit

During handling of the above exception, another exception occurred:

self = 

    def test_convolve_8(self):
        dtype = numpy.float32
>       self.run_test("def np_convolve_8(a,b):\n from numpy import convolve\n return convolve(a,b,'full')",
                      numpy.arange(7).astype(dtype) + 1j*numpy.arange(7).astype(dtype),
                      numpy.arange(12).astype(dtype) +1j*numpy.arange(12).astype(dtype),
                      np_convolve_8=[NDArray[numpy.complex64,:],NDArray[numpy.complex64,:]])

pythran/tests/test_numpy_func2.py:239:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
pythran/tests/__init__.py:312: in run_test
    cxx_compiled = compile_pythrancode(
pythran/toolchain.py:418: in compile_pythrancode
    output_file = compile_cxxcode(module_name,
pythran/toolchain.py:355: in compile_cxxcode
    output_binary = compile_cxxfile(module_name, fdpath,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
module_name = 'test_np_convolve_8', cxxfile = '/tmp/tmpi_nalyim.cpp'
output_binary = None
kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]}
builddir = '/tmp/tmp7jhlg0wq', buildtmp = '/tmp/tmpuo6leeh0'
extension = 

    def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs):
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi_nalyim.cpp -o /tmp/tmpuo6leeh0/tmp/tmpi_nalyim.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_8' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpuo6leeh0/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpi_nalyim.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmpi_nalyim.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpi_nalyim.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING  pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING  pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
________________________ TestNumpyFunc2.test_convolve_9 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_9', ...}
klass = dist = ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs
        to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpt1vk4fl0.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpt1vk4fl0.cpp'], output_dir = '/tmp/tmpoagbreyn' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o', ('/tmp/tmpt1vk4fl0.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o', '/tmp/tmpt1vk4fl0.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o', src = '/tmp/tmpt1vk4fl0.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt1vk4fl0.cpp -o /tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_convolve_9', cxxfile = '/tmp/tmpt1vk4fl0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcwzgz3bt', buildtmp = '/tmp/tmpoagbreyn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_convolve_9', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt1vk4fl0.cpp -o /tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_convolve_9(self): dtype = numpy.float > self.run_test("def np_convolve_9(a,b):\n from numpy import convolve\n return convolve(a,b,'full')", numpy.arange(7).astype(dtype) + 1j* numpy.arange(7).astype(dtype), numpy.arange(12).astype(dtype) + 1j* numpy.arange(12).astype(dtype), np_convolve_9=[NDArray[numpy.complex128,:],NDArray[numpy.complex128,:]]) pythran/tests/test_numpy_func2.py:246: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_convolve_9', cxxfile = '/tmp/tmpt1vk4fl0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcwzgz3bt', buildtmp = '/tmp/tmpoagbreyn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): 
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt1vk4fl0.cpp -o /tmp/tmpoagbreyn/tmp/tmpt1vk4fl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_convolve_9' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpoagbreyn/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpt1vk4fl0.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/convolve.hpp:5,
                 from /tmp/tmpt1vk4fl0.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v);
   | ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpt1vk4fl0.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
1975 | conj(_Tp __x)
     | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_1 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_1', ...}
klass = dist = ok = True
def setup (**attrs):
"""The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
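The captured stderr above is the real failure behind this CompileError: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj, and the xsimd headers this build compiles against do not provide that symbol, so any kernel whose include chain reaches conjugate.hpp (convolve, correlate, ...) fails identically. The snippet below is a minimal, hypothetical reproducer sketch of the same compile path outside the test suite; it assumes pythran is importable in this build root, that compile_pythrancode accepts (module_name, code) as the run_test frames above suggest, and the kernel and module names are illustrative rather than taken from this log.

# Hypothetical reproducer sketch, not part of the original log.
from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

KERNEL = (
    "#pythran export np_convolve_9(complex128[], complex128[])\n"
    "def np_convolve_9(a, b):\n"
    "    from numpy import convolve\n"
    "    return convolve(a, b, 'full')\n"
)

try:
    # Same entry point as pythran/tests/__init__.py:run_test in the traceback:
    # pythran code -> generated C++ -> native module via compile_cxxfile.
    compile_pythrancode("np_convolve_9_repro", KERNEL)
except CompileError as exc:
    # Expected in this environment: gcc exits with status 1 on the missing
    # xsimd::conj in conjugate.hpp, as in the diagnostics above.
    print("compilation failed as in the build log:", exc)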
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpuvib9fun.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpuvib9fun.cpp'], output_dir = '/tmp/tmpi2w0kij5' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o', ('/tmp/tmpuvib9fun.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o', '/tmp/tmpuvib9fun.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o', src = '/tmp/tmpuvib9fun.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuvib9fun.cpp -o /tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_1', cxxfile = '/tmp/tmpuvib9fun.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0u_50fvq', buildtmp = '/tmp/tmpi2w0kij5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuvib9fun.cpp -o /tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_1(self): > self.run_test("def np_correlate_1(a,b):\n from numpy import correlate\n return correlate(a,b)", numpy.arange(10,dtype=float), numpy.arange(12,dtype=float), np_correlate_1=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:128: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_1', cxxfile = '/tmp/tmpuvib9fun.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0u_50fvq', buildtmp = '/tmp/tmpi2w0kij5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuvib9fun.cpp -o /tmp/tmpi2w0kij5/tmp/tmpuvib9fun.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpi2w0kij5/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpuvib9fun.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /tmp/tmpuvib9fun.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v);
   | ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpuvib9fun.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
1975 | conj(_Tp __x)
     | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_10 _______________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_10', ...}
klass = dist = ok = True
def setup (**attrs):
"""The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
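The diagnostic here is the same one seen for test_convolve_9: correlate.hpp includes conjugate.hpp, so the correlate kernels hit the same missing xsimd::conj overload, and the traceback that begins below for test_correlate_10 sets off down the same setup()/build_ext chain. For orientation, the compile_cxxfile frames repeated in these tracebacks amount to the fake-CLI distutils call sketched below; this is an illustrative reconstruction with placeholder module and file names, not the /tmp paths from this log, and it presumes a pythran-generated C++ source already exists on disk.

# Illustrative sketch of the build_ext invocation compile_cxxfile performs.
from tempfile import mkdtemp
from numpy.distutils.core import setup          # same numpy.distutils wrapper as in the frames above
from pythran.dist import PythranExtension, PythranBuildExt

builddir, buildtmp = mkdtemp(), mkdtemp()
# "demo_module.cpp" is a placeholder for a pythran-generated C++ file.
ext = PythranExtension("demo_module", ["demo_module.cpp"])

setup(name="demo_module",
      ext_modules=[ext],
      cmdclass={"build_ext": PythranBuildExt},
      # fake CLI call, as in compile_cxxfile
      script_name="setup.py",
      script_args=["--quiet", "build_ext",
                   "--build-lib", builddir,
                   "--build-temp", buildtmp])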
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpb0pl2dq_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpb0pl2dq_.cpp'], output_dir = '/tmp/tmpj1qdhzk4' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
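The CCompiler_compile docstring above spells out the compile() contract; in plain distutils terms, the call made a few frames earlier reduces to something like the sketch below (file names, include path, and flags are illustrative only):

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

compiler = new_compiler()
customize_compiler(compiler)            # pull CC/CFLAGS settings from sysconfig
objects = compiler.compile(
    ['example.cpp'],
    output_dir='build-tmp',
    macros=[('ENABLE_PYTHON_MODULE', None), ('NDEBUG',)],   # 2-tuple defines, 1-tuple undefines
    include_dirs=['/usr/include/python3.10'],
    extra_postargs=['-std=c++11', '-O1'],
)                                        # raises distutils.errors.CompileError on failure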
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o', ('/tmp/tmpb0pl2dq_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o', '/tmp/tmpb0pl2dq_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o', src = '/tmp/tmpb0pl2dq_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
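A few frames above, numpy.distutils' single_compile() guards each object file with a module-level lock, an "in progress" set and a job semaphore, so parallel extension builds neither duplicate work nor exceed the job count. The same guard in isolation looks roughly like this (names are local to the sketch, job cap picked arbitrarily):

import threading
import time

_lock = threading.Lock()
_in_progress = set()
_jobs = threading.Semaphore(4)

def guarded_compile(obj, compile_one):
    while True:
        with _lock:                       # no atomic check-and-add, so take the lock
            if obj not in _in_progress:
                _in_progress.add(obj)
                break
        time.sleep(0.1)                   # another worker owns this object; wait
    try:
        with _jobs:                       # bound the number of concurrent compilations
            compile_one(obj)
    finally:
        with _lock:
            _in_progress.remove(obj)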
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpb0pl2dq_.cpp -o /tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_10', cxxfile = '/tmp/tmpb0pl2dq_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppb9ffmy1', buildtmp = '/tmp/tmpj1qdhzk4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_10', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
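The error handling shown in the frames that follow is the whole trick behind pythran's compile_cxxfile(): setup() is driven programmatically with a fake command line, distutils converts any CCompilerError into SystemExit("error: ..."), and pythran re-raises that as a CompileError. A reduced sketch of the pattern, with a plain placeholder Extension standing in for PythranExtension:

from distutils.core import setup, Extension
from distutils.errors import CompileError

def build_ext_in_place(name, sources, builddir, buildtmp):
    ext = Extension(name, sources)                 # placeholder; pythran uses PythranExtension
    try:
        setup(name=name,
              ext_modules=[ext],
              script_name='setup.py',              # fake CLI call
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as e:
        raise CompileError(str(e))                 # surface the failure to the caller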
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpb0pl2dq_.cpp -o /tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_10(self): > self.run_test("def np_correlate_10(a,b):\n from numpy import correlate\n return correlate(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(7,dtype=numpy.float32), np_correlate_10=[NDArray[float,:],NDArray[numpy.float32,:]]) pythran/tests/test_numpy_func2.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_10', cxxfile = '/tmp/tmpb0pl2dq_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppb9ffmy1', buildtmp = '/tmp/tmpj1qdhzk4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises 
CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpb0pl2dq_.cpp -o /tmp/tmpj1qdhzk4/tmp/tmpb0pl2dq_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_10' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpj1qdhzk4/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpb0pl2dq_.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /tmp/tmpb0pl2dq_.cpp:16: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpb0pl2dq_.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_correlate_11 _______________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_11', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpnfp3rchj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpnfp3rchj.cpp'], output_dir = '/tmp/tmpxthez34c' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpxthez34c/tmp/tmpnfp3rchj.o', ('/tmp/tmpnfp3rchj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpxthez34c/tmp/tmpnfp3rchj.o', '/tmp/tmpnfp3rchj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpxthez34c/tmp/tmpnfp3rchj.o', src = '/tmp/tmpnfp3rchj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
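UnixCCompiler__compile, shown just below, ends by concatenating compiler_so, the preprocessor options, the source/object pair, the gcc auto-dependency switches (-MMD -MF) and extra_postargs into a single argv. That ordering is why the distro hardening flags appear first in the failing command and pythran's own options (for instance -O1) come last and take effect. An illustrative reconstruction with made-up paths and a trimmed flag set:

compiler_so = ['gcc', '-O2', '-Wall', '-fPIC']                  # distro CFLAGS live here
cc_args = ['-DENABLE_PYTHON_MODULE', '-I/usr/include/python3.10', '-c']
deps = ['-MMD', '-MF', 'example.o.d']                           # write header deps as a side effect
extra_postargs = ['-std=c++11', '-O1', '-w']                    # pythran's flags, appended last
argv = compiler_so + cc_args + ['example.cpp', '-o', 'example.o'] + deps + extra_postargs
print(' '.join(argv))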
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnfp3rchj.cpp -o /tmp/tmpxthez34c/tmp/tmpnfp3rchj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_11', cxxfile = '/tmp/tmpnfp3rchj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjeqhufk_', buildtmp = '/tmp/tmpxthez34c' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_11', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
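For reference, the test quoted further below (test_correlate_11, like test_correlate_10 before it) only exercises numpy.correlate with mode 'same' on mixed float32/float64 inputs; the pure-NumPy behaviour it compares against can be reproduced without pythran:

import numpy

a = numpy.arange(12, dtype=numpy.float32)
b = numpy.arange(7, dtype=float)
print(numpy.correlate(a, b, 'same'))    # mode 'same': output length is max(len(a), len(b)) == 12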
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnfp3rchj.cpp -o /tmp/tmpxthez34c/tmp/tmpnfp3rchj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_11(self): > self.run_test("def np_correlate_11(a,b):\n from numpy import correlate\n return correlate(a,b,'same')", numpy.arange(12,dtype=numpy.float32), numpy.arange(7,dtype=float), np_correlate_11=[NDArray[numpy.float32,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:190: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_11', cxxfile = '/tmp/tmpnfp3rchj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjeqhufk_', buildtmp = '/tmp/tmpxthez34c' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises 
CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnfp3rchj.cpp -o /tmp/tmpxthez34c/tmp/tmpnfp3rchj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_11' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpxthez34c/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpnfp3rchj.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /tmp/tmpnfp3rchj.cpp:16:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                   ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpnfp3rchj.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_2 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_2', ...}
klass =
dist =
ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
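The captured stderr above is the actual failure behind every correlate test in this log: the generated C++ includes pythonic/include/numpy/conjugate.hpp, whose line 25 calls xsimd::conj(), and the xsimd headers found on the include path do not provide that symbol. A hedged reproduction sketch, assuming pythran and numpy are importable in the build root, feeds the same kernel to pythran's public compile_pythrancode entry point so the error can be triggered without the full test suite:

    import pythran

    # export spec spelled from the NDArray types used by test_correlate_11 above
    kernel = (
        "#pythran export np_correlate_11(float32[], float[])\n"
        "def np_correlate_11(a, b):\n"
        "    from numpy import correlate\n"
        "    return correlate(a, b, 'same')\n"
    )
    pythran.compile_pythrancode("repro_np_correlate_11", kernel)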
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
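run_commands() and run_command() above are plain dispatch over the command names parsed from script_args. A self-contained sketch of the same dispatch using stock distutils objects and no extensions (so the 'build_ext' command returns immediately):

    from distutils.dist import Distribution

    dist = Distribution({'name': 'demo',              # placeholder metadata
                         'script_name': 'setup.py',
                         'script_args': ['--quiet', 'build_ext']})
    dist.parse_command_line()        # fills dist.commands, here ['build_ext']
    for cmd in dist.commands:        # what run_commands() iterates over
        dist.run_command(cmd)        # get_command_obj -> ensure_finalized -> run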
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
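The build_ext.run() body above boils down to: create a CCompiler, let sysconfig customize it, copy the include and library settings onto it, then compile and link. The same setup outside of build_ext, with an illustrative include path:

    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    cc = new_compiler()        # default compiler class for this platform
    customize_compiler(cc)     # apply CC, CFLAGS, LDSHARED, ... from sysconfig
    cc.set_include_dirs(['/usr/include/python3.10'])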
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
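The pythran build_extension frame above backs up the compiler's executable settings and overwrites them when the extension carries a cxx attribute. A hedged paraphrase of that step in isolation (not pythran's actual code; cxx is whatever C++ driver the extension requests):

    def override_compiler(compiler, cxx):
        # unix-like CCompiler objects store each tool as an argv list
        for key in ('preprocessor', 'compiler_cxx', 'compiler_so',
                    'compiler', 'linker_exe', 'linker_so'):
            var = getattr(compiler, key, None)
            if var is None:
                continue
            if isinstance(var, list):
                var[0] = cxx          # swap the executable, keep the flags
            else:
                setattr(compiler, key, cxx)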
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
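The flag-stripping loop at the top of the pythran frame above removes every flag listed in the [compiler] ignoreflags setting from both the compile and link argv lists (the classic case being -Wstrict-prototypes, which is C-only). The same loop in isolation, with that flag as an illustrative default:

    def strip_flags(compiler, ignoreflags=('-Wstrict-prototypes',)):
        for flag in ignoreflags:
            for target in ('compiler_so', 'linker_so'):
                argv = getattr(compiler, target, None)
                while argv and flag in argv:
                    argv.remove(flag)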
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpgpc2fm75.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpgpc2fm75.cpp'], output_dir = '/tmp/tmpjha699oq' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpjha699oq/tmp/tmpgpc2fm75.o', ('/tmp/tmpgpc2fm75.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
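CCompiler_compile above bounds parallelism with a module-level semaphore and a thread pool, so several extensions building at once never exceed the configured job count. A standalone sketch of the same shape with placeholder work items:

    import threading
    from multiprocessing.pool import ThreadPool

    jobs = 4
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(item):
        obj, src = item
        with job_semaphore:            # at most `jobs` concurrent compiles
            print('compiling', src, '->', obj)

    build_items = [('a.o', 'a.cpp'), ('b.o', 'b.cpp')]
    if len(build_items) > 1 and jobs > 1:
        with ThreadPool(jobs) as pool:     # parallel build
            pool.map(single_compile, build_items)
    else:
        for item in build_items:           # serial fallback
            single_compile(item)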
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpjha699oq/tmp/tmpgpc2fm75.o', '/tmp/tmpgpc2fm75.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpjha699oq/tmp/tmpgpc2fm75.o', src = '/tmp/tmpgpc2fm75.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgpc2fm75.cpp -o /tmp/tmpjha699oq/tmp/tmpgpc2fm75.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_2', cxxfile = '/tmp/tmpgpc2fm75.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa05xgczu', buildtmp = '/tmp/tmpjha699oq' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgpc2fm75.cpp -o /tmp/tmpjha699oq/tmp/tmpgpc2fm75.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_2(self): > self.run_test("def np_correlate_2(a,b):\n from numpy import correlate\n return correlate(a,b)", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_correlate_2=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_2', cxxfile = '/tmp/tmpgpc2fm75.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpa05xgczu', buildtmp = '/tmp/tmpjha699oq' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
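compile_cxxfile, whose docstring ends above, converts the SystemExit that distutils raises on failure into a distutils CompileError, which is what run_test finally reports. That translation on its own:

    from distutils.errors import CompileError

    def run_setup_or_raise(setup_call):
        # distutils signals failure via SystemExit; surface it as CompileError
        try:
            setup_call()
        except SystemExit as e:
            raise CompileError(str(e))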
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgpc2fm75.cpp -o /tmp/tmpjha699oq/tmp/tmpgpc2fm75.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpjha699oq/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpgpc2fm75.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /tmp/tmpgpc2fm75.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpgpc2fm75.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_correlate_3 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp0k_ka64y.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp0k_ka64y.cpp'], output_dir = '/tmp/tmpv7iao0zm' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o', ('/tmp/tmp0k_ka64y.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
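CCompiler_compile above bounds parallelism with a module-level semaphore: each ThreadPool worker takes a slot from _job_semaphore before calling self._compile(), so no more compilers run at once than get_num_build_jobs() allows, even when several extensions build in parallel. A minimal sketch of that pattern, where build_one and the object/source pairs are hypothetical stand-ins for the real _compile() and build items:

    # Bounded-parallelism sketch of the ThreadPool + Semaphore pattern used above.
    import threading
    import multiprocessing.pool

    jobs = 4                                    # stands in for get_num_build_jobs()
    _job_semaphore = threading.Semaphore(jobs)

    def build_one(item):
        # stand-in for self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
        obj, (src, ext) = item
        with _job_semaphore:                    # at most `jobs` compile jobs run concurrently
            print("compiling", src, "->", obj)

    build_items = [("a.o", ("a.cpp", ".cpp")), ("b.o", ("b.cpp", ".cpp"))]
    pool = multiprocessing.pool.ThreadPool(jobs)
    pool.map(build_one, build_items)
    pool.close()
    pool.join()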
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o', '/tmp/tmp0k_ka64y.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o', src = '/tmp/tmp0k_ka64y.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0k_ka64y.cpp -o /tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_3', cxxfile = '/tmp/tmp0k_ka64y.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpf0mtn1qy', buildtmp = '/tmp/tmpv7iao0zm' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0k_ka64y.cpp -o /tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_3(self): > self.run_test("def np_correlate_3(a,b):\n from numpy import correlate\n return correlate(a,b,'valid')", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_correlate_3=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:139: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_3', cxxfile = '/tmp/tmp0k_ka64y.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpf0mtn1qy', buildtmp = '/tmp/tmpv7iao0zm' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure 
''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0k_ka64y.cpp -o /tmp/tmpv7iao0zm/tmp/tmp0k_ka64y.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpv7iao0zm/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp0k_ka64y.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
from /tmp/tmp0k_ka64y.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v);
| ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
from /tmp/tmp0k_ka64y.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
1975 | conj(_Tp __x)
| ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_4 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_4', ...}
klass = dist = ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
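The command line this gateway parses can be supplied through the script_args keyword instead of a real shell invocation; pythran's compile_cxxfile (quoted earlier in this traceback, pythran/toolchain.py:300) relies on that, passing a fake script_name together with the build_ext arguments and converting the SystemExit raised on failure into a CompileError (pythran/toolchain.py:313). A minimal sketch of that calling pattern, with a placeholder module name and source file:

    # In-process distutils invocation, mirroring compile_cxxfile above.
    # 'example_module' and 'example.cpp' are placeholders.
    from distutils.core import setup
    from distutils.extension import Extension
    from distutils.errors import CompileError

    ext = Extension('example_module', sources=['example.cpp'])
    try:
        setup(name='example_module',
              ext_modules=[ext],
              script_name='setup.py',                 # fake CLI call
              script_args=['--quiet', 'build_ext',
                           '--build-lib', 'build-lib',
                           '--build-temp', 'build-tmp'])
    except SystemExit as exc:
        raise CompileError(str(exc))                  # same conversion compile_cxxfile performs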
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpxsi98kgj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpxsi98kgj.cpp'], output_dir = '/tmp/tmpwfeo2ljz' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o', ('/tmp/tmpxsi98kgj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o', '/tmp/tmpxsi98kgj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o', src = '/tmp/tmpxsi98kgj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxsi98kgj.cpp -o /tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_4', cxxfile = '/tmp/tmpxsi98kgj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgn9nuv1y', buildtmp = '/tmp/tmpwfeo2ljz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxsi98kgj.cpp -o /tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_4(self): > self.run_test("def np_correlate_4(a,b):\n from numpy import correlate\n return correlate(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(10,dtype=float), np_correlate_4=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:145: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_4', cxxfile = '/tmp/tmpxsi98kgj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgn9nuv1y', buildtmp = '/tmp/tmpwfeo2ljz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxsi98kgj.cpp -o /tmp/tmpwfeo2ljz/tmp/tmpxsi98kgj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_4' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpwfeo2ljz/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpxsi98kgj.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /tmp/tmpxsi98kgj.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpxsi98kgj.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_correlate_5 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpjyahi9ww.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpjyahi9ww.cpp'], output_dir = '/tmp/tmpxuknp5l_' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o', ('/tmp/tmpjyahi9ww.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o', '/tmp/tmpjyahi9ww.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o', src = '/tmp/tmpjyahi9ww.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjyahi9ww.cpp -o /tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_5', cxxfile = '/tmp/tmpjyahi9ww.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp94j79305', buildtmp = '/tmp/tmpxuknp5l_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjyahi9ww.cpp -o /tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_5(self): > self.run_test("def np_correlate_5(a,b):\n from numpy import correlate\n return correlate(a,b,'same')", numpy.arange(12,dtype=float), numpy.arange(7,dtype=float), np_correlate_5=[NDArray[float,:],NDArray[float,:]]) pythran/tests/test_numpy_func2.py:151: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_5', cxxfile = '/tmp/tmpjyahi9ww.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp94j79305', buildtmp = '/tmp/tmpxuknp5l_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' 
builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjyahi9ww.cpp -o /tmp/tmpxuknp5l_/tmp/tmpjyahi9ww.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_5' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall 
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpxuknp5l_/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpjyahi9ww.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /tmp/tmpjyahi9ww.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpjyahi9ww.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_correlate_6 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpzwmvkv7i.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpzwmvkv7i.cpp'], output_dir = '/tmp/tmprxb8vv3h' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
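Note: the numpy.distutils frames that follow throttle parallel compiles with a module-level semaphore plus an "in progress" set guarded by a lock. A simplified sketch of that pattern (hypothetical names; the real code waits for an object that another extension is already building instead of skipping it):

    import threading

    _lock = threading.Lock()
    _in_progress = set()
    _job_slots = threading.Semaphore(4)           # at most 4 concurrent compiler invocations

    def compile_one(obj, compile_fn):
        with _lock:
            if obj in _in_progress:               # same object requested by another extension
                return
            _in_progress.add(obj)
        try:
            with _job_slots:                      # take a slot, then run the actual compiler
                compile_fn(obj)
        finally:
            with _lock:
                _in_progress.discard(obj)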
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o', ('/tmp/tmpzwmvkv7i.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o', '/tmp/tmpzwmvkv7i.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o', src = '/tmp/tmpzwmvkv7i.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzwmvkv7i.cpp -o /tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_6', cxxfile = '/tmp/tmpzwmvkv7i.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppgrvioqk', buildtmp = '/tmp/tmprxb8vv3h' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
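Note: compile_cxxfile, shown a few frames up, builds the extension entirely in-process by handing setup() a fake command line. A stripped-down sketch of the same trick (module name and source path are hypothetical):

    from distutils.core import setup
    from distutils.errors import CompileError
    from distutils.extension import Extension

    def build_in_process(name, source, build_lib, build_temp):
        try:
            setup(name=name,
                  ext_modules=[Extension(name, [source])],
                  script_name='setup.py',                     # fake argv[0]
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', build_lib,
                               '--build-temp', build_temp])
        except SystemExit as exc:                             # distutils reports failures this way
            raise CompileError(str(exc)) from exc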
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
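Note: the _setup_stop_after stages described in this docstring ('init', 'config', 'commandline', 'run') are also reachable through distutils.core.run_setup; a small illustrative use, assuming a setup.py in the current directory:

    from distutils.core import run_setup

    dist = run_setup('setup.py',
                     script_args=['build_ext', '--inplace'],
                     stop_after='commandline')    # parse options but do not build
    print(dist.commands)                          # e.g. ['build_ext']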
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzwmvkv7i.cpp -o /tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_6(self): > self.run_test("def np_correlate_6(a,b):\n from numpy import correlate\n return correlate(a,b,'full')", numpy.arange(12) + 1j*numpy.arange(12.), numpy.arange(7.) 
+ 1j* numpy.arange(7.), np_correlate_6=[NDArray[complex,:],NDArray[complex,:]]) pythran/tests/test_numpy_func2.py:157: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_6', cxxfile = '/tmp/tmpzwmvkv7i.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppgrvioqk', buildtmp = '/tmp/tmprxb8vv3h' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzwmvkv7i.cpp -o /tmp/tmprxb8vv3h/tmp/tmpzwmvkv7i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns 
building 'test_np_correlate_6' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmprxb8vv3h/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpzwmvkv7i.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6, from /tmp/tmpzwmvkv7i.cpp:14: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
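Note: the diagnostic above (continued just below) is the root cause of both correlate failures: pythonic/include/numpy/conjugate.hpp calls xsimd::conj(), which the xsimd headers in this environment do not provide, hence g++'s suggestion of std::conj. conjugate is involved at all because, for complex inputs, numpy.correlate conjugates its second argument; plain NumPy shows the behaviour the generated C++ has to reproduce:

    import numpy as np

    a = np.arange(12) + 1j * np.arange(12.0)
    b = np.arange(7.0) + 1j * np.arange(7.0)
    full = np.correlate(a, b, 'full')              # internally uses conj(b)
    zero_lag = np.sum(a[:len(b)] * np.conj(b))     # k == 0 term of the correlation
    assert np.allclose(full[len(b) - 1], zero_lag)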
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpzwmvkv7i.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestNumpyFunc2.test_correlate_7 ________________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpwyr47ksj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpwyr47ksj.cpp'], output_dir = '/tmp/tmpjugm76aw' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o', ('/tmp/tmpwyr47ksj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o', '/tmp/tmpwyr47ksj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o', src = '/tmp/tmpwyr47ksj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwyr47ksj.cpp -o /tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_7', cxxfile = '/tmp/tmpwyr47ksj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbsrszw7o', buildtmp = '/tmp/tmpjugm76aw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwyr47ksj.cpp -o /tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_7(self): dtype = numpy.float32 > self.run_test("def np_correlate_7(a,b):\n from numpy import correlate\n return correlate(a,b,'full')", numpy.arange(12).astype(dtype) + 1j*numpy.arange(12).astype(dtype), numpy.arange(7).astype(dtype) + 1j* numpy.arange(7).astype(dtype), np_correlate_7=[NDArray[numpy.complex64,:],NDArray[numpy.complex64,:]]) pythran/tests/test_numpy_func2.py:164: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_7', cxxfile = '/tmp/tmpwyr47ksj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbsrszw7o', buildtmp = '/tmp/tmpjugm76aw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwyr47ksj.cpp -o /tmp/tmpjugm76aw/tmp/tmpwyr47ksj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_7' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpjugm76aw/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpwyr47ksj.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /tmp/tmpwyr47ksj.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpwyr47ksj.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_8 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_8', ...}
klass =
dist =
ok = True
    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
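The captured stderr above pins down what actually fails in this whole block: pythonic/include/numpy/conjugate.hpp line 25 calls xsimd::conj() on a batch of complex values, and the xsimd headers visible to this build apparently do not declare conj, so cc1plus stops there and the toolchain falls back to the "flood of C++ errors" warning. Every correlate test over complex64 input reaches the same include chain (correlate.hpp -> conjugate.hpp) and fails identically. The failing step can be replayed outside the test harness with the compile_pythrancode entry point named in the traceback; a minimal sketch, assuming pythran 0.11.0 is importable in the chroot, with a #pythran export line standing in for the NDArray specs that run_test passes:

    # Minimal sketch, not part of the test suite: replay the failing compile.
    # compile_pythrancode is the pythran/toolchain.py function in the traceback;
    # the module name and export line here are illustrative.
    from distutils.errors import CompileError
    from pythran import compile_pythrancode

    code = (
        "#pythran export np_correlate_7(complex64[:], complex64[:])\n"
        "def np_correlate_7(a, b):\n"
        "    from numpy import correlate\n"
        "    return correlate(a, b, 'full')\n"
    )

    try:
        compile_pythrancode("test_np_correlate_7", code)
    except CompileError as exc:
        # Expected to reproduce the xsimd::conj diagnostic captured above.
        print("compile failed:", exc)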
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpely0h31r.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpely0h31r.cpp'], output_dir = '/tmp/tmp2dg3nv6f' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
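For reference, the frames above show the hand-off: build_ext collects extra_compile_args, then self.compiler.compile() (wrapped by numpy.distutils' CCompiler_compile) spawns the gcc command quoted in the error. A distutils CCompiler can also be driven directly, which helps separate flag problems from header problems; a small self-contained sketch, with file names and flags that are illustrative rather than taken from this build:

    # Sketch: drive a distutils CCompiler directly instead of via build_ext.
    import pathlib
    import tempfile
    from distutils.ccompiler import new_compiler
    from distutils.errors import CompileError
    from distutils.sysconfig import customize_compiler

    workdir = pathlib.Path(tempfile.mkdtemp())
    src = workdir / "demo.cpp"
    src.write_text("int answer() { return 42; }\n")

    cc = new_compiler()
    customize_compiler(cc)  # picks up the distro CFLAGS, as in the log above

    try:
        objs = cc.compile([str(src)],
                          output_dir=str(workdir / "build"),
                          extra_postargs=["-std=c++11"])
        print("built objects:", objs)
    except CompileError as exc:  # same exception type raised in the log
        print("compile failed:", exc)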
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o', ('/tmp/tmpely0h31r.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o', '/tmp/tmpely0h31r.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o', src = '/tmp/tmpely0h31r.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpely0h31r.cpp -o /tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_8', cxxfile = '/tmp/tmpely0h31r.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpisxa4k1z', buildtmp = '/tmp/tmp2dg3nv6f' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpely0h31r.cpp -o /tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_8(self): dtype = numpy.float32 > self.run_test("def np_correlate_8(a,b):\n from numpy import correlate\n return correlate(a,b,'full')", numpy.arange(7).astype(dtype) + 1j*numpy.arange(7).astype(dtype), numpy.arange(12).astype(dtype) + 1j*numpy.arange(12).astype(dtype), np_correlate_8=[NDArray[numpy.complex64,:],NDArray[numpy.complex64,:]]) pythran/tests/test_numpy_func2.py:171: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_8', cxxfile = '/tmp/tmpely0h31r.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpisxa4k1z', buildtmp = '/tmp/tmp2dg3nv6f' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpely0h31r.cpp -o /tmp/tmp2dg3nv6f/tmp/tmpely0h31r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_8' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp2dg3nv6f/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpely0h31r.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /tmp/tmpely0h31r.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpely0h31r.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________________ TestNumpyFunc2.test_correlate_9 ________________________
[gw5] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_9', ...}
klass =
dist =
ok = True
    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphnqxgfg1.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphnqxgfg1.cpp'], output_dir = '/tmp/tmpazf3hiow' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
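Not from the captured output: the CCompiler_compile docstring above is the generic distutils-style compile API that numpy.distutils wraps before handing work to single_compile. A minimal sketch of driving that API directly follows; the source file "example.cpp", the output directory and the flags are illustrative stand-ins, not values taken from this build.

# Sketch only: compile one C++ source to an object file, the operation
# that CCompiler_compile ultimately performs per source file.
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

compiler = new_compiler()        # platform default compiler (gcc on this builder)
customize_compiler(compiler)     # fold in the interpreter's CFLAGS/CXX settings

objects = compiler.compile(
    ["example.cpp"],                           # hypothetical source file
    output_dir="build-tmp",
    macros=[("ENABLE_PYTHON_MODULE", None)],   # define-without-value macro
    include_dirs=["/usr/include/python3.10"],
    extra_postargs=["-std=c++11"],
)
print(objects)                   # e.g. ['build-tmp/example.o']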
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o', ('/tmp/tmphnqxgfg1.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o', '/tmp/tmphnqxgfg1.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o', src = '/tmp/tmphnqxgfg1.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphnqxgfg1.cpp -o /tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_correlate_9', cxxfile = '/tmp/tmphnqxgfg1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2css0677', buildtmp = '/tmp/tmpazf3hiow' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_correlate_9', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
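Not from the captured output: the setup() docstring above describes the command-driven flow that pythran's compile_cxxfile() (visible earlier in this traceback) exercises programmatically by faking a command line. Below is a minimal sketch of that pattern with a plain distutils Extension; the module name, source file and build directories are illustrative.

# Sketch only: drive setup() without a real command line by passing
# script_name/script_args, as compile_cxxfile() does.
from distutils.core import setup, Extension

ext = Extension("demo", sources=["demo.cpp"])   # hypothetical source
setup(
    name="demo",
    ext_modules=[ext],
    script_name="setup.py",                     # pretend we were invoked as setup.py
    script_args=["--quiet", "build_ext",
                 "--build-lib", "build-lib",
                 "--build-temp", "build-tmp"],
)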
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphnqxgfg1.cpp -o /tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_correlate_9(self): dtype = numpy.float > self.run_test("def np_correlate_9(a,b):\n from numpy import correlate\n return correlate(a,b,'full')", numpy.arange(7).astype(dtype) + 1j*numpy.arange(7).astype(dtype), numpy.arange(12).astype(dtype) + 1j*numpy.arange(12).astype(dtype), np_correlate_9=[NDArray[numpy.complex128,:],NDArray[numpy.complex128,:]]) pythran/tests/test_numpy_func2.py:178: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_correlate_9', cxxfile = '/tmp/tmphnqxgfg1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2css0677', buildtmp = '/tmp/tmpazf3hiow' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphnqxgfg1.cpp -o /tmp/tmpazf3hiow/tmp/tmphnqxgfg1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_correlate_9' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpazf3hiow/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmphnqxgfg1.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/correlate.hpp:6,
                 from /tmp/tmphnqxgfg1.cpp:14:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmphnqxgfg1.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_________________ TestNormalizeMethods.test_dispatch_conjugate _________________
[gw0] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_dispatch_conjugate', ...}
klass = dist = ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
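Not from the captured output: the cc1plus diagnostic under "Captured stderr call" above is the root cause of both failures in this excerpt, including the test_dispatch_conjugate traceback that begins here. The generated C++ calls xsimd::conj() in pythonic/include/numpy/conjugate.hpp, and the xsimd headers in use on this builder apparently do not provide conj for complex batches, so gcc can only suggest std::conj. A minimal reproduction sketch through pythran's public compile_pythrancode() helper follows; the module name and export signature are illustrative, not taken from the test suite.

# Sketch only: reproduce the failing numpy.correlate code path directly.
import pythran

code = '''
#pythran export np_correlate_9(complex128[], complex128[])
def np_correlate_9(a, b):
    from numpy import correlate
    return correlate(a, b, 'full')
'''

# Expected to raise distutils.errors.CompileError when the xsimd headers
# in use lack xsimd::conj, matching the failures captured in this log.
pythran.compile_pythrancode("np_correlate_9", code)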
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpbx14ryfe.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpbx14ryfe.cpp'], output_dir = '/tmp/tmpr5n67m__' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o', ('/tmp/tmpbx14ryfe.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
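Not from the captured output: the single_compile()/ThreadPool machinery shown above caps concurrent compile jobs with a semaphore sized to the number of build jobs. A minimal sketch of that concurrency pattern follows; the job count and file names are illustrative.

# Sketch only: a ThreadPool fans work out while a Semaphore caps how many
# jobs run at once, mirroring the structure of CCompiler_compile above.
import threading
from multiprocessing.pool import ThreadPool

jobs = 4                                  # stand-in for get_num_build_jobs()
job_semaphore = threading.Semaphore(jobs)

def single_compile(src):
    with job_semaphore:                   # at most `jobs` concurrent "compiles"
        print("compiling", src)

with ThreadPool(jobs) as pool:
    pool.map(single_compile, ["a.cpp", "b.cpp", "c.cpp"])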
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o', '/tmp/tmpbx14ryfe.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o', src = '/tmp/tmpbx14ryfe.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpbx14ryfe.cpp -o /tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_dispatch_conjugate', cxxfile = '/tmp/tmpbx14ryfe.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpsd4m3iqz', buildtmp = '/tmp/tmpr5n67m__' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_dispatch_conjugate', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpbx14ryfe.cpp -o /tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_dispatch_conjugate(self): > self.run_test("def dispatch_conjugate(c, n): import numpy; return complex.conjugate(c), numpy.conjugate(n), c.conjugate(), n.conjugate()", 2.j, numpy.array([1.j+1.]), dispatch_conjugate=[complex, NDArray[complex, :]]) pythran/tests/test_normalize_methods.py:98: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_dispatch_conjugate', cxxfile = '/tmp/tmpbx14ryfe.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpsd4m3iqz', buildtmp = '/tmp/tmpr5n67m__' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises 
CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpbx14ryfe.cpp -o /tmp/tmpr5n67m__/tmp/tmpbx14ryfe.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_dispatch_conjugate' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpr5n67m__/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpbx14ryfe.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/__dispatch__/conjugate.hpp:4, from /tmp/tmpbx14ryfe.cpp:12: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpbx14ryfe.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________________ TestNumpyFunc3.test_vdot0 ___________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpie4bhpeo.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpie4bhpeo.cpp'], output_dir = '/tmp/tmpp26kzs5n' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o', ('/tmp/tmpie4bhpeo.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o', '/tmp/tmpie4bhpeo.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o', src = '/tmp/tmpie4bhpeo.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpie4bhpeo.cpp -o /tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_vdot0', cxxfile = '/tmp/tmpie4bhpeo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp9ve27f_0', buildtmp = '/tmp/tmpp26kzs5n' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpie4bhpeo.cpp -o /tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_vdot0(self): > self.run_test(""" def np_vdot0(x, y): from numpy import vdot return vdot(x, y)""", numpy.array(numpy.arange(6.).reshape(3, 2), dtype=numpy.float32), numpy.array(numpy.arange(6.).reshape(6), dtype=numpy.float32), np_vdot0=[NDArray[numpy.float32,:,:], NDArray[numpy.float32,:]]) pythran/tests/test_numpy_func3.py:308: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_vdot0', cxxfile = '/tmp/tmpie4bhpeo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp9ve27f_0', buildtmp = '/tmp/tmpp26kzs5n' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced 
shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpie4bhpeo.cpp -o /tmp/tmpp26kzs5n/tmp/tmpie4bhpeo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_vdot0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpp26kzs5n/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpie4bhpeo.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/vdot.hpp:10, from /tmp/tmpie4bhpeo.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpie4bhpeo.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________________ TestNumpyFunc3.test_vdot1 ___________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp4ohwbn3a.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp4ohwbn3a.cpp'], output_dir = '/tmp/tmp_wyxnso5' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
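The objects = self.compiler.compile(...) call above is the generic CCompiler entry point whose parameters the docstring just listed. A stand-alone sketch of that call, with illustrative paths, macro names, and flags, using the Python 3.10-era distutils API exercised in this log (distutils is deprecated and removed in newer Python releases); it assumes a Unix-like C compiler is available.

# Compile one C source through the CCompiler API, as build_extension does above.
import pathlib
import tempfile
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

src_dir = pathlib.Path(tempfile.mkdtemp())
build_tmp = tempfile.mkdtemp()
(src_dir / "hello.c").write_text("int hello(void) { return 42; }\n")

cc = new_compiler()      # picks a platform compiler, e.g. unix/gcc
customize_compiler(cc)   # applies CC/CFLAGS etc. from sysconfig and the environment

objects = cc.compile(
    [str(src_dir / "hello.c")],
    output_dir=build_tmp,                 # like --build-temp in the log above
    macros=[("ENABLE_SOMETHING", None)],  # becomes -DENABLE_SOMETHING
    include_dirs=[str(src_dir)],          # becomes -I<dir>
    extra_postargs=["-O1"],               # appended after the generated cc_args
)
print(objects)  # one object file per source; a failure raises CompileError
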
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o', ('/tmp/tmp4ohwbn3a.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
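numpy.distutils' CCompiler_compile, quoted above, bounds parallel compilation with a semaphore sized by get_num_build_jobs() and a ThreadPool over the work items. Below is a simplified, self-contained sketch of that pattern; compile_one, SOURCES, and JOBS are illustrative names, and unlike numpy's version, which waits for an in-progress duplicate, this sketch simply skips it.

# Bounded-parallelism compile loop: semaphore caps concurrency, pool walks the list.
import threading
import time
from multiprocessing.pool import ThreadPool

JOBS = 4                                   # numpy derives this from get_num_build_jobs()
_job_semaphore = threading.Semaphore(JOBS)
_global_lock = threading.Lock()
_processing = set()                        # sources currently being "compiled"

SOURCES = ["module_%d.cpp" % i for i in range(10)]

def compile_one(src):
    # Skip sources already claimed by another thread (numpy waits instead).
    with _global_lock:
        if src in _processing:
            return
        _processing.add(src)
    try:
        with _job_semaphore:               # at most JOBS concurrent "compiles"
            time.sleep(0.1)                # stand-in for self._compile(...)
            print("compiled", src)
    finally:
        with _global_lock:
            _processing.discard(src)

pool = ThreadPool(JOBS)
pool.map(compile_one, SOURCES)
pool.close()
pool.join()
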
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o', '/tmp/tmp4ohwbn3a.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o', src = '/tmp/tmp4ohwbn3a.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4ohwbn3a.cpp -o /tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_vdot1', cxxfile = '/tmp/tmp4ohwbn3a.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjxjs28gu', buildtmp = '/tmp/tmp_wyxnso5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
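The call at the top of this frame, setup(name=..., script_name='setup.py', script_args=[...]), drives distutils entirely from Python with a fake command line instead of a real setup.py invocation. A minimal stand-alone sketch of that pattern, assuming a C toolchain and the Python development headers are installed; the toymod name and its source are illustrative, and the SystemExit-to-exception translation mirrors what compile_cxxfile does further down in this traceback.

# Programmatic distutils build with a fake CLI, as pythran's toolchain does.
import pathlib
import tempfile
from distutils.core import setup, Extension

src_dir = pathlib.Path(tempfile.mkdtemp())
builddir = tempfile.mkdtemp()
buildtmp = tempfile.mkdtemp()
(src_dir / "toymod.c").write_text(
    '#include <Python.h>\n'
    'static struct PyModuleDef mod = {PyModuleDef_HEAD_INIT, "toymod", NULL, -1, NULL};\n'
    'PyMODINIT_FUNC PyInit_toymod(void) { return PyModule_Create(&mod); }\n'
)

try:
    setup(
        name="toymod",
        ext_modules=[Extension("toymod", [str(src_dir / "toymod.c")])],
        # fake CLI call
        script_name="setup.py",
        script_args=["--quiet", "build_ext",
                     "--build-lib", builddir,
                     "--build-temp", buildtmp],
    )
except SystemExit as e:
    # distutils reports build failures as SystemExit; surface them as an exception
    raise RuntimeError("build failed: %s" % e) from None

print("built into", builddir)
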
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4ohwbn3a.cpp -o /tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_vdot1(self): > self.run_test(""" def np_vdot1(x, y): from numpy import vdot return vdot(x, y)""", numpy.array(numpy.arange(6.).reshape(3, 2), dtype=numpy.float32), numpy.array(numpy.arange(6.).reshape(6), dtype=numpy.float64), np_vdot1=[NDArray[numpy.float32,:,:], NDArray[numpy.float64,:]]) pythran/tests/test_numpy_func3.py:320: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_vdot1', cxxfile = '/tmp/tmp4ohwbn3a.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjxjs28gu', buildtmp = '/tmp/tmp_wyxnso5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced 
shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4ohwbn3a.cpp -o /tmp/tmp_wyxnso5/tmp/tmp4ohwbn3a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_vdot1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp_wyxnso5/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp4ohwbn3a.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/vdot.hpp:10,
                 from /tmp/tmp4ohwbn3a.cpp:15:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp4ohwbn3a.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
__________________________ TestNumpyFunc3.test_vdot2 ___________________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot2', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs
        to do, in a highly flexible and user-driven way. Briefly: create a
        Distribution instance; find and parse config files; parse the command
        line; run each Distutils command found there, customized by the options
        supplied to 'setup()' (as keyword arguments), in config files, and on
        the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpxjwfqzn5.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpxjwfqzn5.cpp'], output_dir = '/tmp/tmpuu8qq81o' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o', ('/tmp/tmpxjwfqzn5.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o', '/tmp/tmpxjwfqzn5.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o', src = '/tmp/tmpxjwfqzn5.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjwfqzn5.cpp -o /tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_vdot2', cxxfile = '/tmp/tmpxjwfqzn5.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4cha4ig4', buildtmp = '/tmp/tmpuu8qq81o' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjwfqzn5.cpp -o /tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_vdot2(self): > self.run_test(""" def np_vdot2(x, y): from numpy import vdot return vdot(x, y)""", numpy.array(numpy.arange(6.).reshape(3, 2), dtype=numpy.complex128), numpy.array(numpy.arange(6.).reshape(6), dtype=numpy.complex128), np_vdot2=[NDArray[numpy.complex128,:,:], NDArray[numpy.complex128,:]]) pythran/tests/test_numpy_func3.py:332: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_vdot2', cxxfile = '/tmp/tmpxjwfqzn5.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4cha4ig4', buildtmp = '/tmp/tmpuu8qq81o' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of 
the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjwfqzn5.cpp -o /tmp/tmpuu8qq81o/tmp/tmpxjwfqzn5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_vdot2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects 
-fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpuu8qq81o/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpxjwfqzn5.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/vdot.hpp:10, from /tmp/tmpxjwfqzn5.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpxjwfqzn5.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________________ TestNumpyFunc3.test_vdot3 ___________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
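Annotation: the run_command() machinery quoted here can be exercised on its own. A rough sketch only, using an empty distribution so build_ext has nothing to do; all names are made up:
    # Walks the same parse_command_line()/run_command() path as the traceback above.
    from distutils.core import Distribution

    dist = Distribution({"name": "demo",
                         "script_name": "setup.py",
                         "script_args": ["--quiet", "build_ext"]})
    dist.parse_command_line()          # fills dist.commands from script_args
    dist.run_command("build_ext")      # get_command_obj() + ensure_finalized() + run()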
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
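Annotation: the compiler object configured in build_ext.run() above can also be driven by hand. A minimal sketch, assuming a POSIX toolchain and a hypothetical demo.c; include path and macro are illustrative only:
    # Same new_compiler()/customize_compiler() calls as in run() above,
    # followed by a single manual compile step.
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    cc = new_compiler()
    customize_compiler(cc)                              # apply CC/CFLAGS/... from sysconfig
    cc.set_include_dirs(["/usr/include/python3.10"])
    cc.define_macro("ENABLE_PYTHON_MODULE")
    objects = cc.compile(["demo.c"], output_dir="build/tmp")   # raises CompileError on failure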
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
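Annotation: the flag-stripping loop that follows uses a remove-until-ValueError idiom; the same idiom in isolation, with a made-up flag list:
    # Mirrors the ignoreflags removal performed just below.
    flags = ["gcc", "-O2", "-Wstrict-prototypes", "-fPIC", "-Wstrict-prototypes"]
    for bad in ("-Wstrict-prototypes",):
        try:
            while True:
                flags.remove(bad)      # list.remove() raises ValueError once none are left
        except ValueError:
            pass
    print(flags)                       # ['gcc', '-O2', '-fPIC']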
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp84o2f0w_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp84o2f0w_.cpp'], output_dir = '/tmp/tmpqek1y2dm' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o', ('/tmp/tmp84o2f0w_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
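Annotation: the job-limiting and fan-out pattern in CCompiler_compile above can be summarized in a small standalone sketch; the compile step is simulated, not a real compiler call:
    # A semaphore caps concurrent "compiles" while a ThreadPool maps over the work list.
    import threading
    import time
    from multiprocessing.pool import ThreadPool

    jobs = 4
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(src):
        with job_semaphore:            # at most `jobs` simulated compiles at once
            time.sleep(0.1)            # stand-in for self._compile(...)
            return src + ".o"

    sources = ["a.cpp", "b.cpp", "c.cpp", "d.cpp", "e.cpp"]
    with ThreadPool(jobs) as pool:
        print(pool.map(single_compile, sources))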
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o', '/tmp/tmp84o2f0w_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o', src = '/tmp/tmp84o2f0w_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp84o2f0w_.cpp -o /tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_np_vdot3', cxxfile = '/tmp/tmp84o2f0w_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpkqos6upj', buildtmp = '/tmp/tmpqek1y2dm' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_np_vdot3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp84o2f0w_.cpp -o /tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_vdot3(self): > self.run_test(""" def np_vdot3(x, y): from numpy import vdot return vdot(x, y)""", numpy.array(numpy.arange(6.), dtype=numpy.complex128), numpy.array(numpy.arange(6.), dtype=numpy.complex128) * -1j, np_vdot3=[NDArray[numpy.complex128,:], NDArray[numpy.complex128,:]]) pythran/tests/test_numpy_func3.py:344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_np_vdot3', cxxfile = '/tmp/tmp84o2f0w_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpkqos6upj', buildtmp = '/tmp/tmpqek1y2dm' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared 
library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp84o2f0w_.cpp -o /tmp/tmpqek1y2dm/tmp/tmp84o2f0w_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_np_vdot3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g 
-grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpqek1y2dm/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp84o2f0w_.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/vdot.hpp:10, from /tmp/tmp84o2f0w_.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp84o2f0w_.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm0 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpnybmu5rr.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpnybmu5rr.cpp'], output_dir = '/tmp/tmpgmzljvhz' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
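# Sketch of how the macro list handed to compiler.compile() above is assembled:
# define_macros entries stay (name, value) pairs and undef_macros become 1-tuples,
# which later surface as -D / -U preprocessor options. The defined macros below
# mirror the ones visible in this log; undef_macros is empty in this build (the
# -UNDEBUG in the failing command comes from extra_compile_args), so 'NDEBUG'
# here is only illustrative.
define_macros = [('ENABLE_PYTHON_MODULE', None),
                 ('__PYTHRAN__', '3'),
                 ('PYTHRAN_BLAS_OPENBLAS', None)]
undef_macros = ['NDEBUG']

macros = define_macros[:]
for undef in undef_macros:
    macros.append((undef,))
# macros == [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'),
#            ('PYTHRAN_BLAS_OPENBLAS', None), ('NDEBUG',)]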
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o', ('/tmp/tmpnybmu5rr.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
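# Reduced model (not numpy.distutils itself) of the concurrency scheme shown in
# CCompiler_compile above: a ThreadPool fans the per-object compiles out while a
# semaphore caps how many run at once and a lock guards the set of in-progress
# objects. The real single_compile() waits when an object is already being built;
# this sketch simply skips it.
import threading
from multiprocessing.pool import ThreadPool

jobs = 4
job_semaphore = threading.Semaphore(jobs)
global_lock = threading.Lock()
processing = set()

def single_compile(obj):
    with global_lock:
        if obj in processing:
            return
        processing.add(obj)
    try:
        with job_semaphore:          # at most `jobs` concurrent compiles
            print('compiling', obj)  # stands in for self._compile(...)
    finally:
        with global_lock:
            processing.discard(obj)

pool = ThreadPool(jobs)
pool.map(single_compile, ['a.o', 'b.o', 'c.o'])
pool.close(); pool.join()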
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o', '/tmp/tmpnybmu5rr.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o', src = '/tmp/tmpnybmu5rr.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
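# Sketch of how the gcc command quoted in the CompileError below is put together by
# UnixCCompiler__compile (lists abbreviated; the complete values appear in this log):
# the compiler_so driver line (CC plus the distro CFLAGS) first, then preprocessor
# and -I options, the source and object, optional -MMD/-MF dependency flags, and
# Pythran's extra args appended last so flags like -O1 and -w take effect after the
# distro defaults.
compiler_so = ['gcc', '-O2', '-Wall', '-fPIC']
cc_args = ['-DENABLE_PYTHON_MODULE', '-I/usr/include/python3.10', '-c']
src = '/tmp/tmpnybmu5rr.cpp'
obj = '/tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o'
deps = ['-MMD', '-MF', obj + '.d']       # only added when _auto_depends is true
extra_postargs = ['-std=c++11', '-O1', '-w']

cmd = compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs
print(' '.join(cmd))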
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnybmu5rr.cpp -o /tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm0', cxxfile = '/tmp/tmpnybmu5rr.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpog8el757', buildtmp = '/tmp/tmpgmzljvhz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
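# The PythranExtension + PythranBuildExt pairing used by compile_cxxfile above is
# also how Pythran is normally wired into a hand-written setup.py. Minimal sketch;
# the package name and source file are hypothetical and the import path assumes
# the documented pythran.dist entry points.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

# run as: python setup.py build_ext
setup(name='demo',
      ext_modules=[PythranExtension('demo_kernel', ['demo_kernel.py'])],
      cmdclass={'build_ext': PythranBuildExt})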
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
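# The setup() docstring above walks through four stages (Distribution init, config
# files, command line, running commands), and _setup_stop_after lets a caller stop
# between them. distutils exposes the same staging through distutils.core.run_setup;
# a sketch, assuming a setup.py exists next to the script:
from distutils.core import run_setup

dist = run_setup('setup.py',
                 script_args=['--quiet', 'build_ext'],
                 stop_after='commandline')   # parse everything, run no commands
print(dist.commands)                         # ['build_ext']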
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnybmu5rr.cpp -o /tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm0(self): > self.run_test("def linalg_norm0(x): from numpy.linalg import norm ; return norm(x)", numpy.arange(6.), linalg_norm0=[NDArray[float,:]]) pythran/tests/test_numpy_linalg.py:11: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm0', cxxfile = '/tmp/tmpnybmu5rr.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpog8el757', buildtmp = '/tmp/tmpgmzljvhz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = 
PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnybmu5rr.cpp -o /tmp/tmpgmzljvhz/tmp/tmpnybmu5rr.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpgmzljvhz/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpnybmu5rr.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpnybmu5rr.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpnybmu5rr.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm1 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
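# The captured stderr above for test_linalg_norm0 contains the actual failure:
# pythonic/include/numpy/conjugate.hpp line 25 calls xsimd::conj(v), which the xsimd
# headers used for this build do not declare (gcc suggests std::conj instead), so
# both linalg tests shown here fail with the same CompileError. A hypothetical
# standalone reproduction, mirroring the test source from the first traceback (file
# name and export spec are illustrative); translating it with the pythran
# command-line tool drives the same instantiation of pythonic/numpy/linalg/norm.hpp
# and, in this environment, the same xsimd::conj error outside the test suite.
#pythran export linalg_norm0(float[])
def linalg_norm0(x):
    from numpy.linalg import norm
    return norm(x)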
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpzhe7w8hh.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpzhe7w8hh.cpp'], output_dir = '/tmp/tmpi3y87alc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o', ('/tmp/tmpzhe7w8hh.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o', '/tmp/tmpzhe7w8hh.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o', src = '/tmp/tmpzhe7w8hh.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzhe7w8hh.cpp -o /tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm1', cxxfile = '/tmp/tmpzhe7w8hh.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpc_6lcwbp', buildtmp = '/tmp/tmpi3y87alc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzhe7w8hh.cpp -o /tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm1(self): > self.run_test("def linalg_norm1(x): from numpy.linalg import norm ; return norm(x)", numpy.arange(6.).reshape(2,3), linalg_norm1=[NDArray[float,:,:]]) pythran/tests/test_numpy_linalg.py:14: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm1', cxxfile = '/tmp/tmpzhe7w8hh.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpc_6lcwbp', buildtmp = '/tmp/tmpi3y87alc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = 
PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzhe7w8hh.cpp -o /tmp/tmpi3y87alc/tmp/tmpzhe7w8hh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpi3y87alc/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpzhe7w8hh.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5,
                 from /tmp/tmpzhe7w8hh.cpp:13:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpzhe7w8hh.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
______________________ TestNumpyLinalg.test_linalg_norm2 _______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm2', ...}
klass = dist = ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
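The stderr captured above pins the failure to pythonic/include/numpy/conjugate.hpp calling xsimd::conj(), which the xsimd headers found by this compilation do not provide. A minimal way to reproduce the same compilation outside the test harness, using the public compile_pythrancode entry point that appears in the traceback (the module name repro_norm1 is a placeholder), would be roughly:

from pythran import compile_pythrancode

code = '''
#pythran export linalg_norm1(float[:,:])
def linalg_norm1(x):
    from numpy.linalg import norm
    return norm(x)
'''
# Expected to fail on this build with the same
# "'conj' is not a member of 'xsimd'" diagnostic, surfaced as a CompileError.
compile_pythrancode('repro_norm1', code)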
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
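The run() method above boils down to creating a CCompiler, customizing it from sysconfig, and handing it the include dirs and macros before build_extensions() compiles anything. A stripped-down sketch of that same sequence, with a hypothetical demo.cpp and scratch directory standing in for the real inputs:

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

cc = new_compiler()                  # same factory build_ext.run() calls
customize_compiler(cc)               # applies CC/CFLAGS from sysconfig, as in the gcc line above
cc.set_include_dirs(['/usr/include/python3.10'])          # assumed include path
objects = cc.compile(['demo.cpp'],                         # hypothetical source
                     output_dir='/tmp/demo-build-temp',
                     extra_postargs=['-std=c++11'])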
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
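The flag-stripping loop this comment describes continues just below. PythranBuildExt exists so that projects outside the test harness can reuse the same compiler tweaking; a typical setup script pairs it with PythranExtension, something like the sketch here (module and source names are placeholders):

from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

# demo_kernel.py is a placeholder pythran-annotated source
# (it needs its own "#pythran export" comments).
ext = PythranExtension('demo_kernel', ['demo_kernel.py'])
setup(name='demo_kernel',
      ext_modules=[ext],
      cmdclass={'build_ext': PythranBuildExt})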
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpboj0yuzt.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpboj0yuzt.cpp'], output_dir = '/tmp/tmp6kmxk2jy' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
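The body that follows throttles concurrent gcc invocations with a semaphore sized by get_num_build_jobs(). As a rough sketch, that job count can usually be steered through the NPY_NUM_BUILD_JOBS environment variable (the value 4 below is arbitrary), assuming it is set before the build machinery starts:

import os
os.environ['NPY_NUM_BUILD_JOBS'] = '4'    # arbitrary example value
from numpy.distutils.misc_util import get_num_build_jobs
# With no distribution-level --parallel override in effect,
# this typically reports the value taken from the environment.
print(get_num_build_jobs())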
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o', ('/tmp/tmpboj0yuzt.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o', '/tmp/tmpboj0yuzt.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o', src = '/tmp/tmpboj0yuzt.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpboj0yuzt.cpp -o /tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm2', cxxfile = '/tmp/tmpboj0yuzt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpole62l7v', buildtmp = '/tmp/tmp6kmxk2jy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpboj0yuzt.cpp -o /tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm2(self): > self.run_test("def linalg_norm2(x): from numpy.linalg import norm ; from numpy import inf ; return norm(x, inf)", numpy.arange(6.), linalg_norm2=[NDArray[float,:]]) pythran/tests/test_numpy_linalg.py:17: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm2', cxxfile = '/tmp/tmpboj0yuzt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpole62l7v', buildtmp = '/tmp/tmp6kmxk2jy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() 
extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpboj0yuzt.cpp -o /tmp/tmp6kmxk2jy/tmp/tmpboj0yuzt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp6kmxk2jy/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpboj0yuzt.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpboj0yuzt.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpboj0yuzt.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm3 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp5h3rdect.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp5h3rdect.cpp'], output_dir = '/tmp/tmpc9v9eedq' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
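# Illustrative aside, independent of numpy.distutils internals: the general
# shape of the parallel build that follows, a ThreadPool fans compile jobs out
# while a semaphore caps how many compiler processes run at once. The sources
# are generated placeholders and gcc is assumed to be installed; the real code
# below also de-duplicates objects shared between extensions, omitted here.
import os
import subprocess
import tempfile
import threading
import multiprocessing.pool

workdir = tempfile.mkdtemp()
sources = []
for name in ("a", "b", "c"):
    path = os.path.join(workdir, name + ".c")
    with open(path, "w") as fh:
        fh.write("int f_%s(void) { return 0; }\n" % name)
    sources.append(path)

MAX_JOBS = 2
job_semaphore = threading.Semaphore(MAX_JOBS)

def compile_one(src):
    # hold a semaphore slot while the compiler subprocess runs
    with job_semaphore:
        subprocess.run(["gcc", "-c", src, "-o", src + ".o"], check=True)

pool = multiprocessing.pool.ThreadPool(len(sources))
pool.map(compile_one, sources)
pool.close()
pool.join()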
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o', ('/tmp/tmp5h3rdect.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
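# Illustrative aside on the '_auto_depends' branch of UnixCCompiler._compile
# shown further below: gcc is asked to write a makefile fragment (-MMD -MF)
# naming the non-system headers each translation unit pulled in, so a later
# build can tell when the object file is stale. Paths are hypothetical and gcc
# is assumed to be installed.
import os
import subprocess
import tempfile

workdir = tempfile.mkdtemp()
hdr = os.path.join(workdir, "unit.h")
src = os.path.join(workdir, "unit.c")
obj = os.path.join(workdir, "unit.o")
with open(hdr, "w") as fh:
    fh.write("int noop(void);\n")
with open(src, "w") as fh:
    fh.write('#include "unit.h"\nint noop(void) { return 0; }\n')

subprocess.run(["gcc", "-c", src, "-o", obj, "-MMD", "-MF", obj + ".d"], check=True)
with open(obj + ".d") as fh:
    print(fh.read())  # roughly: "<...>/unit.o: <...>/unit.c <...>/unit.h"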
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o', '/tmp/tmp5h3rdect.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o', src = '/tmp/tmp5h3rdect.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5h3rdect.cpp -o /tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm3', cxxfile = '/tmp/tmp5h3rdect.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5qnpbq1e', buildtmp = '/tmp/tmpc9v9eedq' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm3', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
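# Illustrative aside, not part of the traceback: the same command-line-driven
# flow can be exercised programmatically with distutils.core.run_setup(), whose
# stop_after argument corresponds to the "init" / "config" / "commandline" /
# "run" stages described in the docstring above. The generated setup.py is a
# throwaway placeholder.
import os
import tempfile
from distutils.core import run_setup

workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "setup.py")
with open(script, "w") as fh:
    fh.write("from distutils.core import setup\nsetup(name='demo', version='0.1')\n")

dist = run_setup(script, script_args=["--quiet", "build"], stop_after="commandline")
print(dist.get_name(), dist.commands)  # 'demo' ['build'], nothing has been built yet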
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5h3rdect.cpp -o /tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm3(self): > self.run_test("def linalg_norm3(x): from numpy.linalg import norm ; from numpy import inf ; return norm(x, -inf)", numpy.arange(6.), linalg_norm3=[NDArray[float,:]]) pythran/tests/test_numpy_linalg.py:20: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm3', cxxfile = '/tmp/tmp5h3rdect.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5qnpbq1e', buildtmp = '/tmp/tmpc9v9eedq' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() 
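# Illustrative aside: outside this test harness, the same PythranExtension and
# PythranBuildExt classes are meant to be wired into a project's setup.py,
# roughly as below. 'demo.py' is assumed to exist and to carry a
# "#pythran export" comment for the functions it exposes.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

setup(name="demo",
      version="0.1",
      ext_modules=[PythranExtension("demo", sources=["demo.py"])],
      cmdclass={"build_ext": PythranBuildExt})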
extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5h3rdect.cpp -o /tmp/tmpc9v9eedq/tmp/tmp5h3rdect.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm3' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpc9v9eedq/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp5h3rdect.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5,
                 from /tmp/tmp5h3rdect.cpp:15:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                   ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp5h3rdect.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
______________________ TestNumpyLinalg.test_linalg_norm4 _______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm4', ...} klass = dist = ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do,
    in a highly flexible and user-driven way. Briefly: create a Distribution
    instance; find and parse config files; parse the command line; run each
    Distutils command found there, customized by the options supplied to
    'setup()' (as keyword arguments), in config files, and on the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpgejhdxrs.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpgejhdxrs.cpp'], output_dir = '/tmp/tmpcgj7o40p' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o', ('/tmp/tmpgejhdxrs.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o', '/tmp/tmpgejhdxrs.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o', src = '/tmp/tmpgejhdxrs.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgejhdxrs.cpp -o /tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm4', cxxfile = '/tmp/tmpgejhdxrs.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpxi1g6x3p', buildtmp = '/tmp/tmpcgj7o40p' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm4', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgejhdxrs.cpp -o /tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm4(self): > self.run_test("def linalg_norm4(x): from numpy.linalg import norm ; from numpy import inf ; return norm(x, 0)", numpy.arange(6.), linalg_norm4=[NDArray[float,:]]) pythran/tests/test_numpy_linalg.py:23: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm4', cxxfile = '/tmp/tmpgejhdxrs.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpxi1g6x3p', buildtmp = '/tmp/tmpcgj7o40p' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension 
= PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgejhdxrs.cpp -o /tmp/tmpcgj7o40p/tmp/tmpgejhdxrs.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm4' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpcgj7o40p/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpgejhdxrs.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpgejhdxrs.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpgejhdxrs.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm5 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
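Before the C++ layer enters the picture, the kernels these tests compile are one-liners over numpy.linalg.norm, so the reference results run_test compares against are easy to reproduce. A plain-NumPy sketch using the same inputs the failing tests pass in (no pythran involved):

    # norm(x, 0) counts the non-zero entries; norm(y, ord=inf, axis=1) is the
    # per-row maximum absolute value. Inputs mirror the failing tests' arguments.
    import numpy
    from numpy.linalg import norm

    x = numpy.arange(6.)                        # [0. 1. 2. 3. 4. 5.]
    print(norm(x, 0))                           # 5.0  (five non-zero entries)

    y = (numpy.arange(9) - 4).reshape((3, 3))   # rows [-4 -3 -2], [-1 0 1], [2 3 4]
    print(norm(y, ord=numpy.inf, axis=1))       # [4. 1. 4.]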
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
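build_ext.run() above is where the compiler object is created and seeded with the interpreter-wide build flags, which is where the long hardened-build flag prefix on every gcc line in this log comes from. A sketch of inspecting that, using only the stock distutils helpers visible in the frame:

    # Create and configure a compiler the same way build_ext.run() does,
    # then look at the flags it will prepend to every compile command.
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    cc = new_compiler()        # UnixCCompiler on a Linux builder
    customize_compiler(cc)     # folds in CC, CFLAGS, CPPFLAGS, ... from sysconfig
    print(cc.compiler_so)      # e.g. ['gcc', '-Wno-unused-result', ..., '-fPIC']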
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
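The PythranBuildExt code above scrubs the configured "ignoreflags" (the -Wstrict-prototypes case called out in the comment) from compiler_so and linker_so by removing every occurrence. An equivalent stand-alone version of that scrub, with made-up flag lists purely for illustration:

    # Remove every occurrence of each ignored flag from a compiler flag list.
    def scrub(flags, ignored):
        drop = set(ignored)
        return [flag for flag in flags if flag not in drop]

    compiler_so = ["gcc", "-Wstrict-prototypes", "-O2", "-Wstrict-prototypes", "-fPIC"]
    print(scrub(compiler_so, ["-Wstrict-prototypes"]))   # ['gcc', '-O2', '-fPIC']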
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpv89ny4w0.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpv89ny4w0.cpp'], output_dir = '/tmp/tmpp05yqr76' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
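The comment above ("any sensible compiler will give precedence to later command line args") explains the shape of the gcc commands in this log: the global CFLAGS come first and the extension's extra_compile_args (the "-O1 ... -w -UNDEBUG ..." block shown in the "extra options" line of the captured stdout) come last, so the trailing flags take effect. A toy illustration of that ordering, with abbreviated flag lists:

    # distutils appends the extension's extra_compile_args after the global CFLAGS;
    # for gcc the last -O level wins and the trailing -w suppresses the earlier -Wall.
    cflags     = ["-O2", "-Wall", "-Werror=format-security"]   # environment-provided
    extra_args = ["-O1", "-Wall", "-w", "-UNDEBUG"]            # from the test harness
    print(cflags + extra_args)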
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o', ('/tmp/tmpv89ny4w0.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
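CCompiler_compile above combines a ThreadPool with a semaphore sized to the build-job count and a shared "currently being built" set, so that two extensions sharing a source never compile the same object twice. A self-contained sketch of that pattern (simplified: duplicates are skipped here, whereas the real code waits for the owning thread to finish; the names and the print stand-in are illustrative):

    import threading
    import multiprocessing.pool

    jobs = 4
    job_semaphore = threading.Semaphore(jobs)   # caps concurrent real compiles
    lock = threading.Lock()
    in_progress = set()

    def compile_one(obj):
        with lock:
            if obj in in_progress:              # another thread already owns it
                return
            in_progress.add(obj)
        try:
            with job_semaphore:
                print("compiling", obj)         # stand-in for self._compile(...)
        finally:
            with lock:
                in_progress.remove(obj)

    objects = ["mod%d.o" % i for i in range(8)]
    pool = multiprocessing.pool.ThreadPool(jobs)
    pool.map(compile_one, objects)
    pool.close()
    pool.join()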
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o', '/tmp/tmpv89ny4w0.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o', src = '/tmp/tmpv89ny4w0.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpv89ny4w0.cpp -o /tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm5', cxxfile = '/tmp/tmpv89ny4w0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2r36ukqd', buildtmp = '/tmp/tmpp05yqr76' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm5', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
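As in the first failure, the numpy.distutils setup() wrapper above starts from numpy's own command classes and lets the caller's cmdclass entries (here pythran's PythranBuildExt for build_ext) override them before delegating to the stock setup(). A toy sketch of that merge, with string stand-ins for the command classes:

    numpy_cmdclass = {"build_ext": "numpy.build_ext", "build_src": "numpy.build_src"}
    attr = {"cmdclass": {"build_ext": "PythranBuildExt"}}

    cmdclass = numpy_cmdclass.copy()
    cmdclass.update(attr.get("cmdclass", {}))
    print(cmdclass)   # the caller's build_ext wins; numpy's build_src is kept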
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpv89ny4w0.cpp -o /tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm5(self): > self.run_test("def linalg_norm5(x): from numpy.linalg import norm ; from numpy import inf ; return norm(x, ord=inf, axis=1)", (numpy.arange(9) - 4).reshape((3,3)), linalg_norm5=[NDArray[int,:,:]]) pythran/tests/test_numpy_linalg.py:26: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm5', cxxfile = '/tmp/tmpv89ny4w0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2r36ukqd', buildtmp = '/tmp/tmpp05yqr76' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = 
mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpv89ny4w0.cpp -o /tmp/tmpp05yqr76/tmp/tmpv89ny4w0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm5' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security 
-Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpp05yqr76/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpv89ny4w0.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpv89ny4w0.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpv89ny4w0.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm6 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpjbudsh0q.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpjbudsh0q.cpp'], output_dir = '/tmp/tmpvl69mw5x' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o', ('/tmp/tmpjbudsh0q.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
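The numpy.distutils frame above caps concurrent compiles with a module-level semaphore and a thread pool sized by get_num_build_jobs(). A minimal standalone sketch of that pattern follows; it is not the actual numpy.distutils code, and the job count, source names and compile step are placeholders:

    import threading
    from multiprocessing.pool import ThreadPool

    jobs = 4                                   # placeholder for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)  # also bounds work when several pools overlap

    def compile_one(src):
        # take a build slot before running the expensive compile step
        with job_semaphore:
            return src + '.o'                  # stands in for self._compile(...)

    sources = ['a.cpp', 'b.cpp', 'c.cpp']
    if len(sources) > 1 and jobs > 1:
        pool = ThreadPool(jobs)
        objects = pool.map(compile_one, sources)   # parallel build
        pool.close()
    else:
        objects = [compile_one(s) for s in sources]  # serial build
    # objects == ['a.cpp.o', 'b.cpp.o', 'c.cpp.o']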
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o', '/tmp/tmpjbudsh0q.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o', src = '/tmp/tmpjbudsh0q.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjbudsh0q.cpp -o /tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm6', cxxfile = '/tmp/tmpjbudsh0q.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmprc1k5sbu', buildtmp = '/tmp/tmpvl69mw5x' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm6', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
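As the pythran/toolchain.py frames show, compile_cxxfile drives the whole build by calling setup() programmatically with a synthetic command line instead of executing a real setup.py, and, further down the trace, converts the SystemExit that distutils raises on failure back into a CompileError. A simplified sketch of that pattern, using illustrative names and plain distutils classes rather than pythran's PythranExtension/PythranBuildExt:

    from distutils.core import setup
    from distutils.extension import Extension
    from distutils.errors import CompileError
    from tempfile import mkdtemp

    def build_native_module(module_name, cxxfile):
        builddir, buildtmp = mkdtemp(), mkdtemp()
        ext = Extension(module_name, [cxxfile])
        try:
            setup(name=module_name,
                  ext_modules=[ext],
                  # fake CLI call, no setup.py on disk
                  script_name='setup.py',
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            # distutils reports fatal build errors via SystemExit;
            # surface them to the caller as a CompileError instead
            raise CompileError(str(e))
        return builddir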
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjbudsh0q.cpp -o /tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm6(self): > self.run_test("def linalg_norm6(x): from numpy.linalg import norm ; from numpy import inf ; return norm(x, ord=5, axis=(0,))", (numpy.arange(9) - 4).reshape((3,3)), linalg_norm6=[NDArray[int,:,:]]) pythran/tests/test_numpy_linalg.py:29: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm6', cxxfile = '/tmp/tmpjbudsh0q.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmprc1k5sbu', buildtmp = '/tmp/tmpvl69mw5x' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = 
mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjbudsh0q.cpp -o /tmp/tmpvl69mw5x/tmp/tmpjbudsh0q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm6' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security 
-Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpvl69mw5x/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpjbudsh0q.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpjbudsh0q.cpp:15: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpjbudsh0q.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm7 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
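The captured stderr above is the root cause of this block of failures: pythonic/include/numpy/conjugate.hpp calls xsimd::conj on a complex batch and g++ rejects it ("'conj' is not a member of 'xsimd'", suggesting std::conj), which indicates the xsimd headers in use do not provide conj for complex batches; every test that instantiates this code path, such as the numpy.linalg.norm tests here, fails with the same diagnostic. For reference, the function quoted in test_linalg_norm6 is plain NumPy and evaluates fine in pure Python; this is a reference-semantics example only, unrelated to the failing C++ build:

    import numpy
    from numpy.linalg import norm

    x = (numpy.arange(9) - 4).reshape((3, 3))
    # body of linalg_norm6 as quoted in the traceback above:
    # vector 5-norm taken down axis 0 of the 3x3 array
    print(norm(x, ord=5, axis=(0,)))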
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpgv8g998e.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpgv8g998e.cpp'], output_dir = '/tmp/tmp6dnak1k8' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o', ('/tmp/tmpgv8g998e.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o', '/tmp/tmpgv8g998e.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o', src = '/tmp/tmpgv8g998e.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgv8g998e.cpp -o /tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm7', cxxfile = '/tmp/tmpgv8g998e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp47endflr', buildtmp = '/tmp/tmp6dnak1k8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm7', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgv8g998e.cpp -o /tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm7(self): > self.run_test("def linalg_norm7(x): from numpy.linalg import norm ; return norm(x)", numpy.arange(6).reshape(2,3), linalg_norm7=[NDArray[int,:,:]]) pythran/tests/test_numpy_linalg.py:32: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm7', cxxfile = '/tmp/tmpgv8g998e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp47endflr', buildtmp = '/tmp/tmp6dnak1k8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = 
PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgv8g998e.cpp -o /tmp/tmp6dnak1k8/tmp/tmpgv8g998e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm7' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp6dnak1k8/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpgv8g998e.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmpgv8g998e.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpgv8g998e.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyLinalg.test_linalg_norm8 _______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
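The captured stderr above contains the real failure: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj, and the xsimd headers g++ sees on this include path do not provide it, so every test that instantiates numpy.conj / numpy.linalg.norm through this code path aborts with the same diagnostic. That points to a mismatch between the pythonic headers and the xsimd version being included rather than anything ppc64le-specific. The failure can be reproduced outside the test harness through the same entry point the traceback shows (compile_pythrancode -> compile_cxxcode -> compile_cxxfile); a hedged sketch, with an illustrative module name:

# Reproduce the failing compile without the test suite, via the same
# entry point shown in the traceback (pythran/toolchain.py: compile_pythrancode).
from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

code = '''
#pythran export linalg_norm7(int[:,:])
def linalg_norm7(x):
    from numpy.linalg import norm
    return norm(x)
'''

try:
    so_file = compile_pythrancode('repro_linalg_norm7', code)  # module name is illustrative
    print('built', so_file)
except CompileError as exc:
    # On this builder the C++ step fails with
    # "'conj' is not a member of 'xsimd'" from pythonic/include/numpy/conjugate.hpp.
    print('compilation failed as in the log:', str(exc)[:120], '...')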
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
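Before any source file is touched, build_ext.run() (quoted above) creates the compiler object with new_compiler() and lets customize_compiler() pull CC, CFLAGS and the distro hardening flags out of sysconfig; that is where the very long gcc command line reported in the error originates. The same two calls can be exercised directly; a minimal sketch:

# Sketch of the compiler-setup step performed by build_ext.run():
# new_compiler() + customize_compiler() is what produces the long
# "C compiler: gcc ..." line captured in the test output.
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

compiler = new_compiler()        # UnixCCompiler on this builder
customize_compiler(compiler)     # injects CC, CFLAGS, -fPIC, hardening flags, ...

print(' '.join(compiler.compiler_so))   # roughly the "C compiler:" line in the log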
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
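PythranBuildExtMixIn.build_extension (pythran/dist.py, quoted above) snapshots the compiler's command attributes, optionally swaps in a per-extension C++ compiler, and, in the loop that opens the next frame, strips every flag listed under the [compiler] ignoreflags configuration entry from compiler_so and linker_so before handing off to the stock build_extension. The flag stripping reduces to removing each ignored token from a command list; a standalone sketch of that idea, with an illustrative flag list rather than a real .pythranrc:

# Standalone sketch of the flag-stripping idea used by PythranBuildExtMixIn.
def strip_ignored_flags(cmd, ignoreflags):
    """Drop every occurrence of each ignored flag from a compiler command list."""
    ignored = set(ignoreflags)
    return [tok for tok in cmd if tok not in ignored]

compiler_so = ['gcc', '-Wstrict-prototypes', '-O2', '-fPIC', '-Wstrict-prototypes']
print(strip_ignored_flags(compiler_so, ['-Wstrict-prototypes']))
# -> ['gcc', '-O2', '-fPIC']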
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9jr0c11d.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9jr0c11d.cpp'], output_dir = '/tmp/tmpjg0r1d0j' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o', ('/tmp/tmp9jr0c11d.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
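numpy.distutils' CCompiler_compile (quoted above) pushes each source file through single_compile, guarding every object with a global in-progress set and bounding concurrency with a semaphore sized by get_num_build_jobs(); when more than one file is queued it fans the calls out over a ThreadPool, otherwise, as here with a single generated .cpp, it loops serially. A reduced sketch of that scheduling pattern, with a no-op standing in for self._compile():

# Reduced sketch of the scheduling pattern in CCompiler_compile:
# a job semaphore bounds concurrent compiles, a ThreadPool fans the work
# out when there is more than one source file.  compile_one() does no real work.
import threading
import multiprocessing.pool
import time

jobs = 2                                   # stand-in for get_num_build_jobs()
job_semaphore = threading.Semaphore(jobs)

def compile_one(src):
    with job_semaphore:                    # at most `jobs` compiles at once
        time.sleep(0.1)                    # pretend to invoke the compiler
        return src + '.o'

sources = ['a.cpp', 'b.cpp', 'c.cpp']
if len(sources) > 1 and jobs > 1:
    with multiprocessing.pool.ThreadPool(jobs) as pool:
        objects = pool.map(compile_one, sources)
else:
    objects = [compile_one(s) for s in sources]   # the serial path taken in this log
print(objects)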
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o', '/tmp/tmp9jr0c11d.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o', src = '/tmp/tmp9jr0c11d.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9jr0c11d.cpp -o /tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm8', cxxfile = '/tmp/tmp9jr0c11d.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpztv9zwvh', buildtmp = '/tmp/tmpjg0r1d0j' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm8', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9jr0c11d.cpp -o /tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm8(self): > self.run_test("def linalg_norm8(x): from numpy.linalg import norm ; return norm(x)", numpy.arange(6).reshape(2,3) * 1j + 1, linalg_norm8=[NDArray[complex,:,:]]) pythran/tests/test_numpy_linalg.py:35: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm8', cxxfile = '/tmp/tmp9jr0c11d.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpztv9zwvh', buildtmp = '/tmp/tmpjg0r1d0j' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = 
PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9jr0c11d.cpp -o /tmp/tmpjg0r1d0j/tmp/tmp9jr0c11d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm8' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 
-Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpjg0r1d0j/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp9jr0c11d.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5, from /tmp/tmp9jr0c11d.cpp:13: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp9jr0c11d.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyLinalg.test_linalg_norm_pydoc ____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm_pydoc', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. 
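test_linalg_norm8 fails the same way as test_linalg_norm7, only with a complex input (NDArray[complex,:,:]); both bottom out in the same xsimd::conj instantiation in conjugate.hpp. The complex case can also be isolated without the test suite by writing the exported function to a small module and handing it to the pythran command-line driver; a hedged sketch with illustrative file names:

# Standalone reproduction of the complex-valued case from test_linalg_norm8
# using the pythran command-line driver.  File and module names are illustrative.
import pathlib
import subprocess

src = pathlib.Path('repro_norm8.py')
src.write_text(
    '#pythran export linalg_norm8(complex[:,:])\n'
    'def linalg_norm8(x):\n'
    '    from numpy.linalg import norm\n'
    '    return norm(x)\n'
)

# On this builder the C++ step is expected to fail with the same
# "'conj' is not a member of 'xsimd'" diagnostic captured above.
result = subprocess.run(['pythran', str(src)], capture_output=True, text=True)
print('exit status:', result.returncode)
print(result.stderr[-500:])   # tail of the compiler diagnostics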
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). 
Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpgn0mnq1x.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpgn0mnq1x.cpp'], output_dir = '/tmp/tmpo7snxpqy' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o', ('/tmp/tmpgn0mnq1x.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
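A hedged sketch of the CCompiler.compile() API documented in the frame above, using plain distutils rather than numpy.distutils; the source file, include directory, output directory and flags are made-up placeholders, and a Unix-style C compiler on PATH is assumed:

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler
from distutils.errors import CompileError

compiler = new_compiler()          # picks a compiler class for this platform
customize_compiler(compiler)       # applies CC/CFLAGS and friends on Unix-like systems

try:
    objects = compiler.compile(
        ['example.c'],             # hypothetical source file
        output_dir='build-tmp',
        macros=[('NDEBUG', None)],  # becomes -DNDEBUG
        include_dirs=['include'],
        extra_postargs=['-O2'])    # appended after the regular arguments
    print(objects)                 # e.g. ['build-tmp/example.o']
except CompileError as exc:
    print("compilation failed:", exc)
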
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o', '/tmp/tmpgn0mnq1x.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o', src = '/tmp/tmpgn0mnq1x.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgn0mnq1x.cpp -o /tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_linalg_norm_pydoc', cxxfile = '/tmp/tmpgn0mnq1x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1_nbag9l', buildtmp = '/tmp/tmpo7snxpqy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_linalg_norm_pydoc', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgn0mnq1x.cpp -o /tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_linalg_norm_pydoc(self): > self.run_test(''' def linalg_norm_pydoc(x): import numpy as np from numpy import linalg as LA a = np.arange(9) - x b = a.reshape((3, 3)) c = np.array([[ 1, 2, 3], [-1, 1, x]]) return (LA.norm(a), LA.norm(b), LA.norm(a, np.Inf), #LA.norm(b, np.inf), LA.norm(a, -np.inf), #LA.norm(b, -np.inf), LA.norm(a, 1), #LA.norm(b, 1), LA.norm(a, -1), #LA.norm(b, -1), LA.norm(a, 2), #LA.norm(b, 2), LA.norm(a, -2), #LA.norm(b, -2), LA.norm(a, 3), LA.norm(a, -3), LA.norm(c, axis=0), LA.norm(c, axis=1), LA.norm(c, ord=1, axis=1), )''', 10, linalg_norm_pydoc=[int]) pythran/tests/test_numpy_linalg.py:40: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_linalg_norm_pydoc', cxxfile = '/tmp/tmpgn0mnq1x.cpp' output_binary = None kwargs = {'extra_compile_args': 
['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1_nbag9l', buildtmp = '/tmp/tmpo7snxpqy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpgn0mnq1x.cpp -o /tmp/tmpo7snxpqy/tmp/tmpgn0mnq1x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_linalg_norm_pydoc' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 
-fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpo7snxpqy/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpgn0mnq1x.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5,
                 from /tmp/tmpgn0mnq1x.cpp:27:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpgn0mnq1x.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
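For reference, a minimal sketch of the pattern compile_cxxfile() uses in the traceback above: drive distutils programmatically with a fake "build_ext" command line and map the SystemExit raised on compiler failure back to a CompileError. It substitutes a plain distutils Extension for PythranExtension and uses throwaway temporary directories, so it illustrates the mechanism rather than reproducing pythran's actual helper:

from distutils.core import setup, Extension
from distutils.errors import CompileError
from tempfile import mkdtemp

def build_cxx_module(module_name, cxxfile):
    builddir = mkdtemp()    # final location of the built shared library
    buildtmp = mkdtemp()    # scratch space for object files
    ext = Extension(module_name, [cxxfile], language='c++')
    try:
        # Fake a "setup.py build_ext" invocation; distutils signals compiler
        # failures by raising SystemExit, which is converted back into a
        # CompileError so callers can handle it like any other build error.
        setup(name=module_name,
              ext_modules=[ext],
              script_name='setup.py',
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as e:
        raise CompileError(str(e))
    return builddir
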
_____________________ TestNumpyRandom.test_numpy_binomial0 _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9gw6oiol.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9gw6oiol.cpp'], output_dir = '/tmp/tmpsgobe3ci' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o', ('/tmp/tmp9gw6oiol.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
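A small sketch of the bounded-parallelism pattern visible in CCompiler_compile above: a ThreadPool sized to the job count plus a semaphore capping concurrent compiles. The compile step here is a stand-in (a sleep and a print), and the job count and build items are invented for illustration:

import threading
import time
from multiprocessing.pool import ThreadPool

jobs = 4                                    # assumed job count
_job_semaphore = threading.Semaphore(jobs)  # caps concurrent compile slots

def single_compile(item):
    obj, src = item
    with _job_semaphore:                    # take one build slot
        time.sleep(0.1)                     # stand-in for the real compile call
        print("built %s from %s" % (obj, src))

build_items = [('a.o', 'a.cpp'), ('b.o', 'b.cpp'), ('c.o', 'c.cpp')]
if len(build_items) > 1 and jobs > 1:
    pool = ThreadPool(jobs)                 # build in parallel
    pool.map(single_compile, build_items)
    pool.close()
else:
    for item in build_items:                # build serially
        single_compile(item)
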
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o', '/tmp/tmp9gw6oiol.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o', src = '/tmp/tmp9gw6oiol.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9gw6oiol.cpp -o /tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_binomial0', cxxfile = '/tmp/tmp9gw6oiol.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpop60lgpp', buildtmp = '/tmp/tmpsgobe3ci' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9gw6oiol.cpp -o /tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_binomial0(self): code = """ def numpy_binomial0(n, p, size): from numpy.random import binomial from numpy import var a = [binomial(n, p) for x in range(size)] return (abs(float(sum(a))/size - n * p) < .05 and abs(var(a) - n*p*(1-p)) < .05) """ > self.run_test(code, 10., .2, 10**5, numpy_binomial0=[float, float, int]) pythran/tests/test_numpy_random.py:186: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_binomial0', cxxfile = '/tmp/tmp9gw6oiol.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpop60lgpp', buildtmp = '/tmp/tmpsgobe3ci' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native 
module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9gw6oiol.cpp -o /tmp/tmpsgobe3ci/tmp/tmp9gw6oiol.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_binomial0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 
-flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpsgobe3ci/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp9gw6oiol.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmp9gw6oiol.cpp:35:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp9gw6oiol.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO     pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING  pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING  pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
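The binomial failures in this run share one root cause: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, and the xsimd headers used for this build do not expose that overload for the complex batch type (at least not with the includes pythran pulls in), so every kernel that reaches numpy.var's code path fails to compile. A minimal standalone sketch of a reproducer follows; it assumes pythran.toolchain.compile_pythrancode(module_name, code) accepts a module name and a pythran source string, as the tracebacks above suggest, and the module name, kernel, and export line are illustrative only.

from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

# Tiny kernel mirroring the failing tests: numpy.var on a float list pulls in
# pythonic/numpy/var.hpp, which includes pythonic/include/numpy/conjugate.hpp.
KERNEL = '''
#pythran export use_var(float list)
def use_var(a):
    from numpy import var
    return var(a)
'''

try:
    compile_pythrancode("repro_xsimd_conj", KERNEL)
except CompileError as exc:
    # Expected to hit the same diagnostic as above:
    # 'conj' is not a member of 'xsimd'; did you mean 'std::conj'?
    print("reproduced:", exc)

Running a kernel like this outside the test suite should surface the same C++ error without the pytest and distutils layers in between, which makes it easier to bisect against the system xsimd version.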
_____________________ TestNumpyRandom.test_numpy_binomial1 _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfhgxt6sd.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfhgxt6sd.cpp'], output_dir = '/tmp/tmpdf9p4w43' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o', ('/tmp/tmpfhgxt6sd.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o', '/tmp/tmpfhgxt6sd.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o', src = '/tmp/tmpfhgxt6sd.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhgxt6sd.cpp -o /tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_binomial1', cxxfile = '/tmp/tmpfhgxt6sd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpz7w4ndwm', buildtmp = '/tmp/tmpdf9p4w43' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhgxt6sd.cpp -o /tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_binomial1(self): code = """ def numpy_binomial1(n, p, size): from numpy.random import binomial from numpy import var a=binomial(n, p, size) return (abs(float(sum(a))/size - n * p) < .05 and abs(var(a) - n*p*(1-p)) < .05) """ > self.run_test(code, 7., .2, 10**5, numpy_binomial1=[float, float, int]) pythran/tests/test_numpy_random.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_binomial1', cxxfile = '/tmp/tmpfhgxt6sd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpz7w4ndwm', buildtmp = '/tmp/tmpdf9p4w43' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the 
filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhgxt6sd.cpp -o /tmp/tmpdf9p4w43/tmp/tmpfhgxt6sd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_binomial1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto 
-ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpdf9p4w43/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpfhgxt6sd.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpfhgxt6sd.cpp:27:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpfhgxt6sd.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING  pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING  pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_binomial2 _____________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial2', ...} klass = dist = ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
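Aside on the pythran/dist.py frame above: before delegating to the stock build_extension, pythran strips every occurrence of its configured [compiler] ignoreflags from compiler_so and linker_so. A minimal sketch of that idiom, with a made-up 'unwanted' default standing in for pythran's configuration lookup:

# Hedged sketch of the flag-stripping loop shown above; works on any
# UnixCCompiler-style object whose compiler_so / linker_so are flat lists.
def strip_flags(compiler, unwanted=('-Wstrict-prototypes',)):  # example flag only
    for flag in unwanted:
        for target in ('compiler_so', 'linker_so'):
            args = getattr(compiler, target, None)
            if not isinstance(args, list):
                continue                   # e.g. MSVC compilers lack these lists
            while flag in args:            # drop *every* occurrence, not just one
                args.remove(flag)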
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpabxzfs4e.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpabxzfs4e.cpp'], output_dir = '/tmp/tmpf3sfvj20' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o', ('/tmp/tmpabxzfs4e.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
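Aside on the numpy.distutils CCompiler_compile frames above: per-file compiles are fanned out to a ThreadPool while a module-level semaphore caps concurrency and a lock-protected set keeps two workers from building the same object file. A simplified sketch of that pattern; it skips in-flight objects instead of waiting on them, and compile_one stands in for self._compile:

# Hedged, simplified sketch of the semaphore-capped parallel build above.
import threading
from multiprocessing.pool import ThreadPool

_lock = threading.Lock()
_in_flight = set()

def compile_all(items, jobs, compile_one):
    """items: list of (obj, src) pairs; compile_one(obj, src) does one compile."""
    sem = threading.Semaphore(jobs)          # cap concurrent compile jobs

    def worker(item):
        obj, src = item
        with _lock:                          # no atomic check-and-add otherwise
            if obj in _in_flight:
                return                       # the real code sleeps and retries here
            _in_flight.add(obj)
        try:
            with sem:
                compile_one(obj, src)
        finally:
            with _lock:
                _in_flight.discard(obj)

    if jobs > 1 and len(items) > 1:
        with ThreadPool(jobs) as pool:       # thread pool, as in numpy.distutils
            pool.map(worker, items)
    else:
        for item in items:
            worker(item)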
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o', '/tmp/tmpabxzfs4e.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o', src = '/tmp/tmpabxzfs4e.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpabxzfs4e.cpp -o /tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_binomial2', cxxfile = '/tmp/tmpabxzfs4e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp975uxoi5', buildtmp = '/tmp/tmpf3sfvj20' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_binomial2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
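Aside on the setup() source above: the _setup_stop_after checkpoints ('init', 'config', 'commandline', 'run') are reachable through the public helper distutils.core.run_setup. A small sketch that parses a setup script without running any command; './setup.py' and the argument list are placeholders:

# Hedged sketch: stop after command-line parsing, run nothing.
from distutils.core import run_setup

dist = run_setup('./setup.py',
                 script_args=['build_ext', '--inplace'],
                 stop_after='commandline')
print(dist.get_name(), dist.commands)        # inspect what *would* have run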
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpabxzfs4e.cpp -o /tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_binomial2(self): code = """ def numpy_binomial2(n, p, size): from numpy.random import binomial from numpy import sum, var a=binomial(n, p, (size, size)) return (abs(float(sum(a))/(size*size) - n * p) < .05 and abs(var(a) - n*p*(1-p)) < .05) """ > self.run_test(code, 9., .2, 10**3, numpy_binomial2=[float, float, int]) pythran/tests/test_numpy_random.py:206: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_binomial2', cxxfile = '/tmp/tmpabxzfs4e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp975uxoi5', buildtmp = '/tmp/tmpf3sfvj20' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native 
module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpabxzfs4e.cpp -o /tmp/tmpf3sfvj20/tmp/tmpabxzfs4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_binomial2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 
-flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpf3sfvj20/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpabxzfs4e.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpabxzfs4e.cpp:31:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpabxzfs4e.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
____________________ TestNumpyRandom.test_numpy_chisquare0a ____________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare0a', ...}
klass = 
dist = 
ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpxi3c124n.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpxi3c124n.cpp'], output_dir = '/tmp/tmpqztdd_wg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o', ('/tmp/tmpxi3c124n.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o', '/tmp/tmpxi3c124n.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o', src = '/tmp/tmpxi3c124n.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxi3c124n.cpp -o /tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_chisquare0a', cxxfile = '/tmp/tmpxi3c124n.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdl27esum', buildtmp = '/tmp/tmpqztdd_wg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxi3c124n.cpp -o /tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_chisquare0a(self): """ Check chisquare with 1 argument with mean and variance. """ code = """ def numpy_chisquare0a(size): from numpy.random import chisquare from numpy import var, mean df = 3. 
a = [chisquare(df) for x in range(size)] return (abs(mean(a) - df) < 0.05 and abs(var(a) - 2*df) < .05) """ > self.run_test(code, 10 ** 6, numpy_chisquare0a=[int]) pythran/tests/test_numpy_random.py:722: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_chisquare0a', cxxfile = '/tmp/tmpxi3c124n.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdl27esum', buildtmp = '/tmp/tmpqztdd_wg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxi3c124n.cpp -o /tmp/tmpqztdd_wg/tmp/tmpxi3c124n.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured 
stdout call -----------------------------
running build_ext
new_compiler returns
building 'test_numpy_chisquare0a' extension
C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpqztdd_wg/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpxi3c124n.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpxi3c124n.cpp:29:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
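[editor's note] The ‘conj’ is not a member of ‘xsimd’ diagnostic above is the error shared by the test_numpy_chisquare* failures in this log: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj, and the xsimd headers visible to this build do not declare it (the compiler suggests std::conj instead). A minimal reproducer sketch follows; it assumes only the compile_pythrancode entry point that appears in the traceback, and the module and function names in it are illustrative, not taken from the build.

# Minimal sketch: reproduce the same compile failure outside the test suite.
# Assumption: compile_pythrancode is re-exported by the pythran package
# (otherwise use pythran.toolchain.compile_pythrancode, as in the traceback).
# Any exported function that uses numpy.var pulls in pythonic/numpy/var.hpp,
# which includes pythonic/include/numpy/conjugate.hpp and hence references
# xsimd::conj.
import pythran
from distutils.errors import CompileError

code = """
#pythran export use_var(float list)
def use_var(a):
    from numpy import var
    return var(a)
"""

try:
    pythran.compile_pythrancode("use_var_module", code)  # illustrative module name
except CompileError as err:
    # On this builder the same "'conj' is not a member of 'xsimd'" error is expected.
    print(err)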
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpxi3c124n.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_chisquare0b ____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmprjwfaypj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmprjwfaypj.cpp'], output_dir = '/tmp/tmpm5xq82dc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o', ('/tmp/tmprjwfaypj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
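[editor's note] The frame above is numpy.distutils' CCompiler_compile: a job semaphore caps how many object files are compiled at once, a shared set plus a lock keeps two threads from building the same object simultaneously, and a ThreadPool drives the per-object builds when more than one job is allowed. Below is a stripped-down, self-contained sketch of that same pattern; the names (compile_one, JOBS, the print standing in for the real compiler invocation) are illustrative and not numpy.distutils API.

# Sketch of the semaphore + processing-set + ThreadPool pattern shown above.
import threading
import time
from multiprocessing.pool import ThreadPool

JOBS = 2                                   # cap on concurrent compilations
_job_semaphore = threading.Semaphore(JOBS)
_global_lock = threading.Lock()
_processing = set()                        # object files currently being built

def compile_one(obj):
    # Claim the object file; wait if another thread is already building it.
    while True:
        with _global_lock:
            if obj not in _processing:
                _processing.add(obj)
                break
        time.sleep(0.1)
    try:
        with _job_semaphore:               # at most JOBS builds in flight
            print(f"compiling {obj}")      # stand-in for self._compile(...)
    finally:
        with _global_lock:
            _processing.remove(obj)

objects = [f"unit{i}.o" for i in range(8)]
with ThreadPool(4) as pool:                # more workers than JOBS: semaphore throttles
    pool.map(compile_one, objects)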
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o', '/tmp/tmprjwfaypj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o', src = '/tmp/tmprjwfaypj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprjwfaypj.cpp -o /tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_chisquare0b', cxxfile = '/tmp/tmprjwfaypj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdaryfenz', buildtmp = '/tmp/tmpm5xq82dc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprjwfaypj.cpp -o /tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_chisquare0b(self): """ Check chisquare with 2 argument with mean and variance. 
""" code = """ def numpy_chisquare0b(size): from numpy.random import chisquare from numpy import var, mean, sqrt df = 2 a = chisquare(df, size) return (abs(mean(a) - df) < 0.05 and abs(var(a) - df*2 ) < .05) """ > self.run_test(code, 10 ** 6, numpy_chisquare0b=[int]) pythran/tests/test_numpy_random.py:734: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_chisquare0b', cxxfile = '/tmp/tmprjwfaypj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdaryfenz', buildtmp = '/tmp/tmpm5xq82dc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprjwfaypj.cpp -o /tmp/tmpm5xq82dc/tmp/tmprjwfaypj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" 
failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_chisquare0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpm5xq82dc/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmprjwfaypj.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmprjwfaypj.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmprjwfaypj.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_chisquare2 _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfhfogftp.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfhfogftp.cpp'], output_dir = '/tmp/tmp68hy13lb' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp68hy13lb/tmp/tmpfhfogftp.o', ('/tmp/tmpfhfogftp.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
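The CCompiler_compile/single_compile frames above spell out how numpy.distutils avoids building the same object file twice (when several extensions share a source) while still capping how many compiler processes run at once. The following is a minimal, generic sketch of that pattern only, not numpy.distutils' actual code; the names compile_one and build_func and the job count of 4 are illustrative assumptions.

import threading
import time

_global_lock = threading.Lock()
_processing = set()                      # object files currently being built
_job_semaphore = threading.Semaphore(4)  # cap on concurrent compile jobs (example value)

def compile_one(obj, build_func):
    # Wait until no other thread is working on this object file; there is no
    # atomic "check and add" under the GIL, so an explicit lock is needed.
    while True:
        with _global_lock:
            if obj not in _processing:
                _processing.add(obj)
                break
        time.sleep(0.1)
    try:
        # Take a slot from the job semaphore, then actually compile.
        with _job_semaphore:
            build_func(obj)
    finally:
        # Mark the object as done whether or not the build succeeded.
        with _global_lock:
            _processing.discard(obj)

In the serial case this just runs each build in turn; in the parallel case the semaphore keeps the number of simultaneous compiler invocations at or below the configured job count.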
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp68hy13lb/tmp/tmpfhfogftp.o', '/tmp/tmpfhfogftp.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp68hy13lb/tmp/tmpfhfogftp.o', src = '/tmp/tmpfhfogftp.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhfogftp.cpp -o /tmp/tmp68hy13lb/tmp/tmpfhfogftp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_chisquare2', cxxfile = '/tmp/tmpfhfogftp.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphoqiam7u', buildtmp = '/tmp/tmp68hy13lb' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_chisquare2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhfogftp.cpp -o /tmp/tmp68hy13lb/tmp/tmpfhfogftp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_chisquare2(self): """Check chisquare with shape argument with mean and variance.""" code = """ def numpy_chisquare2(size): from numpy.random import chisquare from numpy import mean, var df = 1 a = chisquare(df, size=(size, size)) return (abs(mean(a)) - df < .05 and abs(var(a) - 2*df) < .05) """ > self.run_test(code, 10 ** 3, numpy_chisquare2=[int]) pythran/tests/test_numpy_random.py:746: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_chisquare2', cxxfile = '/tmp/tmpfhfogftp.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmphoqiam7u', buildtmp = '/tmp/tmp68hy13lb' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfhfogftp.cpp -o /tmp/tmp68hy13lb/tmp/tmpfhfogftp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_chisquare2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables 
-fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp68hy13lb/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpfhfogftp.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpfhfogftp.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpfhfogftp.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________ TestNumpyRandom.test_numpy_exponential0 ____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphwo4elhh.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphwo4elhh.cpp'], output_dir = '/tmp/tmpvyi5mbcr' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o', ('/tmp/tmphwo4elhh.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
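The compile_cxxfile frames shown earlier illustrate the other half of the failure chain: the toolchain drives setup() programmatically with a fake command line and throw-away build directories, then converts the SystemExit that distutils raises on failure into a CompileError for the caller. Below is a simplified sketch of that approach under the same Python 3.10 distutils seen in these tracebacks, using a plain Extension in place of PythranExtension/PythranBuildExt; build_cxx_module is an illustrative name, not part of pythran.

from tempfile import mkdtemp
from distutils.core import setup, Extension
from distutils.errors import CompileError

def build_cxx_module(module_name, cxxfile):
    # Build into temporary directories so nothing leaks into the caller's tree.
    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = Extension(module_name, [cxxfile], language='c++')
    try:
        # Fake CLI call: distutils parses script_args just as it would sys.argv[1:].
        setup(name=module_name,
              ext_modules=[ext],
              script_name='setup.py',
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as e:
        # distutils reports command failures via SystemExit; surface them as CompileError.
        raise CompileError(str(e))
    return builddir

This is why the gcc failure above surfaces twice in each traceback: once as the CompileError raised inside the compiler wrapper, and once as the SystemExit that setup() turns back into a CompileError.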
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o', '/tmp/tmphwo4elhh.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o', src = '/tmp/tmphwo4elhh.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphwo4elhh.cpp -o /tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_exponential0', cxxfile = '/tmp/tmphwo4elhh.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppcqd8w_6', buildtmp = '/tmp/tmpvyi5mbcr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphwo4elhh.cpp -o /tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_exponential0(self): """ Check exponential without argument with mean and variance. 
""" code = """ def numpy_exponential0(size): from numpy.random import exponential from numpy import var, mean a = [exponential() for x in range(size)] return (abs(mean(a) -1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 6, numpy_exponential0=[int]) pythran/tests/test_numpy_random.py:660: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_exponential0', cxxfile = '/tmp/tmphwo4elhh.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppcqd8w_6', buildtmp = '/tmp/tmpvyi5mbcr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphwo4elhh.cpp -o /tmp/tmpvyi5mbcr/tmp/tmphwo4elhh.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_exponential0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpvyi5mbcr/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmphwo4elhh.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmphwo4elhh.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmphwo4elhh.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________ TestNumpyRandom.test_numpy_exponential0a ___________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpa5uhrjq8.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpa5uhrjq8.cpp'], output_dir = '/tmp/tmp2stifzzd' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o', ('/tmp/tmpa5uhrjq8.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o', '/tmp/tmpa5uhrjq8.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o', src = '/tmp/tmpa5uhrjq8.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa5uhrjq8.cpp -o /tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_exponential0a', cxxfile = '/tmp/tmpa5uhrjq8.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfs1327rc', buildtmp = '/tmp/tmp2stifzzd' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa5uhrjq8.cpp -o /tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_exponential0a(self): """ Check exponential with 1 argument with mean and variance. """ code = """ def numpy_exponential0a(size): from numpy.random import exponential from numpy import var, mean scale = 2. 
a = [exponential(scale) for x in range(size)] return (abs(mean(a) - scale) < 0.05 and abs(var(a) - scale**2) < .05) """ > self.run_test(code, 10 ** 6, numpy_exponential0a=[int]) pythran/tests/test_numpy_random.py:672: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_exponential0a', cxxfile = '/tmp/tmpa5uhrjq8.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfs1327rc', buildtmp = '/tmp/tmp2stifzzd' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa5uhrjq8.cpp -o /tmp/tmp2stifzzd/tmp/tmpa5uhrjq8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
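Both exponential failures above stop at the same point: pythonic/include/numpy/conjugate.hpp:25 calls xsimd::conj on a complex batch, and gcc reports that conj is not a member of the xsimd namespace, so any kernel that pulls in numpy/var.hpp fails to compile. The failure can be triaged without re-running the whole test suite by handing pythran a minimal kernel that walks the same header chain. The sketch below is not part of the build; it assumes pythran's top-level compile_pythrancode entry point (the same one the traceback passes through at toolchain.py:418) and a writable working directory, and the probe_var name and export signature are invented for the probe.

import pythran

# Minimal kernel: numpy.var pulls in pythonic/numpy/var.hpp, which includes
# pythonic/include/numpy/conjugate.hpp, the header that fails above.
CODE = """
#pythran export probe_var(float list)
def probe_var(a):
    from numpy import var
    return var(a)
"""

try:
    # Same call chain as the tests: compile_pythrancode -> compile_cxxcode
    # -> compile_cxxfile -> distutils build_ext.
    pythran.compile_pythrancode("probe_var", CODE)
    print("conjugate.hpp compiled: xsimd::conj resolved")
except Exception as exc:  # compile_cxxfile raises a distutils CompileError on gcc failure
    print("still broken:", exc)

If the probe fails with the same ‘conj’ is not a member of ‘xsimd’ diagnostic, the problem is in the headers, not in the individual random-number tests.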
----------------------------- Captured stdout call -----------------------------
running build_ext
new_compiler returns
building 'test_numpy_exponential0a' extension
C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp2stifzzd/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpa5uhrjq8.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpa5uhrjq8.cpp:29:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpa5uhrjq8.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________ TestNumpyRandom.test_numpy_exponential0b ___________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpz4jutpl0.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpz4jutpl0.cpp'], output_dir = '/tmp/tmpcnn1ibvt' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
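# [editor's note, not part of the numpy.distutils listing reproduced around it]
# CCompiler_compile throttles parallel builds with a threading.Semaphore sized by
# get_num_build_jobs() and routes every source file through single_compile(). A
# single generated .cpp takes the serial branch (ccompiler.py:359); from there
# single_compile() forwards to self._compile() (ccompiler.py:319), which spawns
# the gcc command shown in the failure below.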
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o', ('/tmp/tmpz4jutpl0.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o', '/tmp/tmpz4jutpl0.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o', src = '/tmp/tmpz4jutpl0.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz4jutpl0.cpp -o /tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_exponential0b', cxxfile = '/tmp/tmpz4jutpl0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcyzma1do', buildtmp = '/tmp/tmpcnn1ibvt' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
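# [editor's note, not part of the distutils listing reproduced around it]
# The next call, dist.run_commands(), is where the 'build_ext' command actually
# executes. When the compiler wrapper further down raises CompileError, this
# setup() frame turns it into SystemExit("error: ..."), and pythran's
# compile_cxxfile (visible later in the same traceback) catches that SystemExit
# and re-raises it as distutils.errors.CompileError, the exception pytest
# ultimately reports at pythran/toolchain.py:313.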
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz4jutpl0.cpp -o /tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_exponential0b(self): """ Check exponential with 2 argument with mean and variance. 
""" code = """ def numpy_exponential0b(size): from numpy.random import exponential from numpy import var, mean, sqrt scale = 2 a = exponential(scale, size) return (abs(mean(a) - scale) < 0.05 and abs(var(a,ddof=1) - scale**2 ) < .05) """ > self.run_test(code, 10 ** 6, numpy_exponential0b=[int]) pythran/tests/test_numpy_random.py:684: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_exponential0b', cxxfile = '/tmp/tmpz4jutpl0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcyzma1do', buildtmp = '/tmp/tmpcnn1ibvt' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz4jutpl0.cpp -o /tmp/tmpcnn1ibvt/tmp/tmpz4jutpl0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_exponential0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpcnn1ibvt/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpz4jutpl0.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpz4jutpl0.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpz4jutpl0.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_laplace1 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpnuo602np.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpnuo602np.cpp'], output_dir = '/tmp/tmpu74xxvsk' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpu74xxvsk/tmp/tmpnuo602np.o', ('/tmp/tmpnuo602np.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpu74xxvsk/tmp/tmpnuo602np.o', '/tmp/tmpnuo602np.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpu74xxvsk/tmp/tmpnuo602np.o', src = '/tmp/tmpnuo602np.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnuo602np.cpp -o /tmp/tmpu74xxvsk/tmp/tmpnuo602np.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_laplace1', cxxfile = '/tmp/tmpnuo602np.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqsy2c5vt', buildtmp = '/tmp/tmpu74xxvsk' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnuo602np.cpp -o /tmp/tmpu74xxvsk/tmp/tmpnuo602np.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_laplace1(self): """ Check laplace with size argument with mean and variance.""" code = """ def numpy_laplace1(size): from numpy.random import laplace from numpy import var, mean from numpy import var, mean, pi u = 0. 
s = 1 rmean = u rvar = 2*s**2 a = laplace(size=size) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_laplace1=[int]) pythran/tests/test_numpy_random.py:1506: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_laplace1', cxxfile = '/tmp/tmpnuo602np.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqsy2c5vt', buildtmp = '/tmp/tmpu74xxvsk' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnuo602np.cpp -o /tmp/tmpu74xxvsk/tmp/tmpnuo602np.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- 
Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_laplace1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpu74xxvsk/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpnuo602np.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpnuo602np.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpnuo602np.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
___________________ TestNumpyRandom.test_numpy_exponential1 ____________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie.
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
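The run_commands()/run_command() frames above are plain distutils dispatch: each command name parsed from script_args is resolved to a command object, finalized, then run. A small sketch of that dispatch which stops short of run(), so nothing is compiled (names and arguments are illustrative):

from distutils.dist import Distribution

# Same shape of attrs as the "fake CLI call" earlier in this log.
dist = Distribution({"name": "demo",
                     "script_name": "setup.py",
                     "script_args": ["--quiet", "build_ext"]})
dist.parse_command_line()                # fills dist.commands = ['build_ext']
cmd = dist.get_command_obj("build_ext")  # command class looked up via cmdclass/defaults
cmd.ensure_finalized()                   # what run_command() does before calling run()
print(type(cmd).__name__, cmd.build_lib, cmd.build_temp)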
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp3yl_4id2.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp3yl_4id2.cpp'], output_dir = '/tmp/tmpejqz2j62' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
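The docstring above describes the CCompiler.compile() interface that build_ext ultimately calls (numpy.distutils adds the job semaphore and thread pool on top of the stock behaviour). A minimal sketch of the same interface with the stock distutils compiler, mirroring the output_dir, macros and extra_postargs arguments seen in this log (paths and flags are illustrative; a working C compiler is assumed):

import os
import tempfile
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "hello.c")
with open(src, "w") as f:
    f.write("int hello(void) { return 42; }\n")

cc = new_compiler()
customize_compiler(cc)                       # pick up platform CFLAGS, as build_ext does
objects = cc.compile([src],
                     output_dir=tmp,         # analogous to --build-temp above
                     macros=[("DEMO", "1")],
                     extra_postargs=["-O1"])  # analogous to the "extra options" line
# Object paths nest output_dir plus the source path, which is why the log shows
# objects like /tmp/tmpu74xxvsk/tmp/tmpnuo602np.o.
print(objects)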
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o', ('/tmp/tmp3yl_4id2.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o', '/tmp/tmp3yl_4id2.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o', src = '/tmp/tmp3yl_4id2.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3yl_4id2.cpp -o /tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_exponential1', cxxfile = '/tmp/tmp3yl_4id2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpitb474pb', buildtmp = '/tmp/tmpejqz2j62' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
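The PythranExtension/PythranBuildExt pair instantiated by compile_cxxfile is also pythran's public distutils integration; in an ordinary setup.py the same pieces look like the sketch below (module and file names are illustrative, and the .py source is expected to carry a '#pythran export' line):

from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

# pythran translates the .py source to C++ as part of the build, then the
# PythranBuildExt command compiles it like any other extension.
ext = PythranExtension("demo_module", ["demo_module.py"],
                       extra_compile_args=["-O1", "-w"])

setup(name="demo_module",
      ext_modules=[ext],
      cmdclass={"build_ext": PythranBuildExt})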
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
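The next frame shows the end of the error chain seen throughout this log: DistutilsError/CCompilerError is converted into SystemExit("error: ..."), and compile_cxxfile later turns that SystemExit back into a CompileError. A toy round-trip of that translation (fake_setup is a stand-in, not a real API):

from distutils.errors import CompileError

def fake_setup():
    # stand-in for distutils.core.setup() failing the way it does below
    raise SystemExit('error: Command "gcc ..." failed with exit status 1')

try:
    try:
        fake_setup()
    except SystemExit as e:
        # what pythran/toolchain.py:compile_cxxfile does with the SystemExit
        raise CompileError(str(e))
except CompileError as exc:
    print(type(exc).__name__, "->", exc)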
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3yl_4id2.cpp -o /tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_exponential1(self): """ Check exponential with size argument with mean and variance.""" code = """ def numpy_exponential1(size): from numpy.random import exponential from numpy import var, mean a = exponential(size=size) return (abs(mean(a) -1 )< .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 6, numpy_exponential1=[int]) pythran/tests/test_numpy_random.py:695: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_exponential1', cxxfile = '/tmp/tmp3yl_4id2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpitb474pb', buildtmp = '/tmp/tmpejqz2j62' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): 
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp3yl_4id2.cpp -o /tmp/tmpejqz2j62/tmp/tmp3yl_4id2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_exponential1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpejqz2j62/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp3yl_4id2.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp3yl_4id2.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp3yl_4id2.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ___________________ TestNumpyRandom.test_numpy_exponential2 ____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpqzw43c9d.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpqzw43c9d.cpp'], output_dir = '/tmp/tmp7t8k03eo' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o', ('/tmp/tmpqzw43c9d.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
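# A small sketch, assuming numpy.distutils' get_num_build_jobs() honours the
# NPY_NUM_BUILD_JOBS environment variable: forcing a single compile job keeps the
# gcc output of parallel single_compile() calls from interleaving, which makes
# failures like the one below easier to attribute to a single source file.
import os
os.environ["NPY_NUM_BUILD_JOBS"] = "1"                  # assumption: read at call time
from numpy.distutils.misc_util import get_num_build_jobs
print(get_num_build_jobs())                             # expected to report 1 under that assumption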
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o', '/tmp/tmpqzw43c9d.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o', src = '/tmp/tmpqzw43c9d.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqzw43c9d.cpp -o /tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_exponential2', cxxfile = '/tmp/tmpqzw43c9d.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3p1k8gl_', buildtmp = '/tmp/tmp7t8k03eo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_exponential2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
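# A hedged reproduction sketch, assuming the pythran.compile_pythrancode wrapper
# named in the traceback below behaves like the toolchain call it wraps: it rebuilds
# the numpy_exponential2 kernel quoted further down in this failure on its own,
# outside pytest, so the underlying C++ error can be studied in isolation.
from pythran import compile_pythrancode

CODE = """
#pythran export numpy_exponential2(int)
def numpy_exponential2(size):
    from numpy.random import exponential
    from numpy import mean, var
    a = exponential(size=(size, size))
    return (abs(mean(a)) - 1 < .05 and abs(var(a) - 1) < .05)
"""
# Expected to raise distutils.errors.CompileError with the same gcc invocation as below.
compile_pythrancode("numpy_exponential2", CODE)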
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqzw43c9d.cpp -o /tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_exponential2(self): """Check exponential with shape argument with mean and variance.""" code = """ def numpy_exponential2(size): from numpy.random import exponential from numpy import mean, var a = exponential(size=(size, size)) return (abs(mean(a)) -1 < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 3, numpy_exponential2=[int]) pythran/tests/test_numpy_random.py:706: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_exponential2', cxxfile = '/tmp/tmpqzw43c9d.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3p1k8gl_', buildtmp = '/tmp/tmp7t8k03eo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqzw43c9d.cpp -o /tmp/tmp7t8k03eo/tmp/tmpqzw43c9d.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_exponential2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables 
-fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp7t8k03eo/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpqzw43c9d.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpqzw43c9d.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpqzw43c9d.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_laplace2 ______________________
[gw7] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace2', ...}
klass = dist = ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do,
    in a highly flexible and user-driven way.
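# A hedged diagnostic sketch: the cc1plus error above ("'conj' is not a member of
# 'xsimd'") comes from pythran's pythonic/include/numpy/conjugate.hpp calling
# xsimd::conj() on a complex batch, which suggests the xsimd headers picked up on
# the include path lack (or place elsewhere) the overload pythran 0.11.0 expects.
# Assuming an unbundled xsimd lives under /usr/include/xsimd (path and macro name
# are assumptions), its reported version can be checked quickly:
import subprocess
print(subprocess.run(["grep", "-rn", "XSIMD_VERSION", "/usr/include/xsimd/"],
                     capture_output=True, text=True).stdout)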
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpg9kaf_7o.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpg9kaf_7o.cpp'], output_dir = '/tmp/tmpc2af9jc6' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o', ('/tmp/tmpg9kaf_7o.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o', '/tmp/tmpg9kaf_7o.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o', src = '/tmp/tmpg9kaf_7o.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpg9kaf_7o.cpp -o /tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_laplace2', cxxfile = '/tmp/tmpg9kaf_7o.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn3xq7k7t', buildtmp = '/tmp/tmpc2af9jc6' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpg9kaf_7o.cpp -o /tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_laplace2(self): """Check laplace with shape argument with mean and variance.""" code = """ def numpy_laplace2(size): from numpy.random import laplace from numpy import mean, var from numpy import var, mean, pi u = 0 s = 1 rmean = u rvar = 2*s**2 a = laplace(size=(size, size)) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_laplace2=[int]) pythran/tests/test_numpy_random.py:1522: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_laplace2', cxxfile = '/tmp/tmpg9kaf_7o.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn3xq7k7t', buildtmp = '/tmp/tmpc2af9jc6' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpg9kaf_7o.cpp -o /tmp/tmpc2af9jc6/tmp/tmpg9kaf_7o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_laplace2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpc2af9jc6/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpg9kaf_7o.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpg9kaf_7o.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpg9kaf_7o.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_logistic0 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmppmcqmeie.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmppmcqmeie.cpp'], output_dir = '/tmp/tmpzl9tjs_l' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o', ('/tmp/tmppmcqmeie.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o', '/tmp/tmppmcqmeie.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o', src = '/tmp/tmppmcqmeie.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmppmcqmeie.cpp -o /tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logistic0', cxxfile = '/tmp/tmppmcqmeie.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1jw7pzjx', buildtmp = '/tmp/tmpzl9tjs_l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmppmcqmeie.cpp -o /tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logistic0(self): """ Check logistic without argument with mean and variance. """ code = """ def numpy_logistic0(size): from numpy.random import logistic from numpy import var, mean, pi u = 0. 
rmean = u rvar = (pi**2/3) a = [logistic() for x in range(size)] return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_logistic0=[int]) pythran/tests/test_numpy_random.py:1380: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logistic0', cxxfile = '/tmp/tmppmcqmeie.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1jw7pzjx', buildtmp = '/tmp/tmpzl9tjs_l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmppmcqmeie.cpp -o /tmp/tmpzl9tjs_l/tmp/tmppmcqmeie.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
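The recurring "'conj' is not a member of 'xsimd'" diagnostic is the common root cause behind every CompileError in this run: pythran 0.11.0's pythonic/include/numpy/conjugate.hpp calls xsimd::conj, which the xsimd headers picked up by this build apparently do not provide, so any generated module that includes pythonic/numpy/var.hpp (or anything else that drags in conjugate.hpp) fails to compile. The following is a minimal reproducer sketch outside the test suite; it assumes pythran 0.11.0 is importable in the build environment and that the top-level pythran package re-exports compile_pythrancode (the traceback only shows it living in pythran/toolchain.py, so fall back to importing it from pythran.toolchain if not). The module name use_var_mod and the kernel are illustrative only.

import pythran

# Any kernel that calls numpy.var pulls in pythonic/numpy/var.hpp, which in
# turn includes the conjugate wrapper that references xsimd::conj, so
# compiling it should hit the same gcc error seen in the captured stderr.
code = """
#pythran export use_var(float[])
import numpy as np

def use_var(a):
    return np.var(a)
"""

# Expected to raise the same distutils.errors.CompileError that
# pythran/toolchain.py:313 re-raises in the tracebacks above.
pythran.compile_pythrancode("use_var_mod", code)

If this fails with the same diagnostic, the pythran/xsimd header pairing is at fault rather than anything specific to the numpy.random tests.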
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logistic0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpzl9tjs_l/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmppmcqmeie.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmppmcqmeie.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmppmcqmeie.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyRandom.test_numpy_f0a ________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpn961c_vk.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpn961c_vk.cpp'], output_dir = '/tmp/tmpkfeu4q4d' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
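# The CCompiler_compile body that follows caps concurrent compile jobs with a
# module-level semaphore and fans the work out over a thread pool.  A
# distilled, standalone sketch of that pattern (not numpy.distutils' actual
# code; the job count is hard-coded here where numpy uses get_num_build_jobs()):
import threading
import multiprocessing.pool

jobs = 4
_job_semaphore = threading.Semaphore(jobs)

def single_compile(src):
    with _job_semaphore:          # at most `jobs` compilations run at once
        print("compiling", src)

pool = multiprocessing.pool.ThreadPool(jobs)
pool.map(single_compile, ["a.cpp", "b.cpp", "c.cpp"])
pool.close()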
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o', ('/tmp/tmpn961c_vk.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
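# single_compile above also has to cope with the same object file being
# requested by several extensions at once: membership in _processing_files is
# checked and updated under an explicit lock, because "check-and-add" on a
# set is not atomic even with the GIL.  The idiom in isolation (illustrative
# names only, not numpy's code verbatim):
import threading

_global_lock = threading.Lock()
_processing_files = set()

def try_claim(obj):
    """Return True once we own `obj`; callers sleep and retry otherwise."""
    with _global_lock:
        if obj not in _processing_files:
            _processing_files.add(obj)
            return True
    return False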
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o', '/tmp/tmpn961c_vk.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o', src = '/tmp/tmpn961c_vk.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpn961c_vk.cpp -o /tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_f0a', cxxfile = '/tmp/tmpn961c_vk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp7klbt9xb', buildtmp = '/tmp/tmpkfeu4q4d' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
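# The docstring above describes the 'cmdclass' hook: a mapping from command
# name to command class that replaces the stock implementation.  Pythran uses
# exactly that hook to install PythranBuildExt in place of the default
# build_ext (see compile_cxxfile earlier in this traceback).  A bare-bones
# sketch of such an override (the subclass here is illustrative, not pythran's):
from distutils.command.build_ext import build_ext

class LoggingBuildExt(build_ext):
    def run(self):
        print("running customized build_ext")
        super().run()

# passed to setup() as: cmdclass={"build_ext": LoggingBuildExt}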
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpn961c_vk.cpp -o /tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_f0a(self): """ Check f with 2 argument with mean and variance. 
""" code = """ def numpy_f0a(size): from numpy.random import f from numpy import var, mean dfnum = 50 dfden = 50 rmean = dfden / (dfden - 2) rvar = (2 * dfden**2 *( dfnum + dfden -2))/(dfnum * (dfden -2)**2 * (dfden -4)) a = [f(dfnum, dfden) for x in range(size)] return (abs(mean(a)- rmean) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_f0a=[int]) pythran/tests/test_numpy_random.py:1130: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_f0a', cxxfile = '/tmp/tmpn961c_vk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp7klbt9xb', buildtmp = '/tmp/tmpkfeu4q4d' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpn961c_vk.cpp -o /tmp/tmpkfeu4q4d/tmp/tmpn961c_vk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context 
-Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_f0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpkfeu4q4d/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpn961c_vk.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpn961c_vk.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpn961c_vk.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_rayleigh0 _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp7m2_qoto.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp7m2_qoto.cpp'], output_dir = '/tmp/tmpe6wggo6x' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o', ('/tmp/tmp7m2_qoto.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o', '/tmp/tmp7m2_qoto.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o', src = '/tmp/tmp7m2_qoto.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7m2_qoto.cpp -o /tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_rayleigh0', cxxfile = '/tmp/tmp7m2_qoto.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4mx55pr5', buildtmp = '/tmp/tmpe6wggo6x' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
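For orientation, the distutils gateway listed above is not being driven by a real setup.py in this run: the compile_cxxfile frame at the top of the traceback fakes the command line and calls setup() in-process. A condensed sketch of that call pattern, using only names that appear in the frames above (the module name, source path and temporary directories are illustrative, and the import paths are an assumption based on the files shown in the traceback):

from tempfile import mkdtemp

from numpy.distutils.core import setup                      # the setup() the traceback passes through
from pythran.dist import PythranExtension, PythranBuildExt

builddir, buildtmp = mkdtemp(), mkdtemp()
extension = PythranExtension("test_numpy_rayleigh0", ["/tmp/example.cpp"])  # hypothetical source path

# In-process equivalent of `setup.py --quiet build_ext --build-lib ... --build-temp ...`,
# mirroring the "fake CLI call" comment in compile_cxxfile above.
setup(name="test_numpy_rayleigh0",
      ext_modules=[extension],
      cmdclass={"build_ext": PythranBuildExt},
      script_name="setup.py",
      script_args=["--quiet", "build_ext",
                   "--build-lib", builddir,
                   "--build-temp", buildtmp])

This round trip is also why each failure is reported twice: distutils turns the CompileError into a SystemExit, and compile_cxxfile turns that SystemExit back into a CompileError.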
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7m2_qoto.cpp -o /tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_rayleigh0(self): """ Check rayleigh without argument with mean and variance. 
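The kernel quoted next compares the sample mean and variance of the draws against the closed-form Rayleigh moments. As a side note, a small helper restating those moments (s stands for the scale parameter behind the rmean/rvar reference values in the kernel):

import numpy as np

def rayleigh_moments(s):
    """Mean and variance of a Rayleigh distribution with scale s,
    i.e. rmean = s*sqrt(pi/2) and rvar = ((4-pi)/2)*s**2 as in the test."""
    mean = s * np.sqrt(np.pi / 2)
    var = (4 - np.pi) / 2 * s ** 2
    return mean, var

# rayleigh_moments(1.0) -> (~1.2533, ~0.4292)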
""" code = """ def numpy_rayleigh0(size): from numpy.random import rayleigh from numpy import var, mean, sqrt, pi a = [rayleigh() for x in range(size)] s = 2 rmean = s*sqrt(pi/2) rvar = ((4-pi)/2)*s**2 return (abs(mean(a)-rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_rayleigh0=[int]) pythran/tests/test_numpy_random.py:1055: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_rayleigh0', cxxfile = '/tmp/tmp7m2_qoto.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4mx55pr5', buildtmp = '/tmp/tmpe6wggo6x' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7m2_qoto.cpp -o /tmp/tmpe6wggo6x/tmp/tmp7m2_qoto.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs 
-Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_rayleigh0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpe6wggo6x/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp7m2_qoto.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp7m2_qoto.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
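This is the root cause behind the compile failures captured in this log: pythran's pythonic/include/numpy/conjugate.hpp calls an xsimd::conj overload that the xsimd headers visible to this build do not provide, and the source context quoted just below points at the offending return xsimd::conj(v) on line 25 of that header. A hypothetical minimal reproducer outside the test suite, using the same entry point that appears in the traceback (pythran.toolchain.compile_pythrancode); the kernel itself is illustrative, not taken from the test suite:

from pythran.toolchain import compile_pythrancode

# numpy.var is enough to pull in pythonic/numpy/conjugate.hpp (see the include
# chain above: conjugate.hpp is included from numpy/var.hpp), so any exported
# kernel using it should hit the same error on this setup.
kernel = """
#pythran export var_of(float64[])
import numpy as np
def var_of(a):
    return np.var(a)
"""

# Expected to raise the same kind of CompileError as the tests in this log
# when the installed xsimd headers lack xsimd::conj.
compile_pythrancode("conj_repro", kernel)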
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp7m2_qoto.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyRandom.test_numpy_f0b ________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpsmi4u3we.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpsmi4u3we.cpp'], output_dir = '/tmp/tmpashihf43' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
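Before the body of CCompiler_compile continues, a note on the very long gcc command lines quoted in the errors above: UnixCCompiler__compile (shown earlier in this log) spawns compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, so the distribution-wide flags carried in compiler_so (-O2, -Wall, the hardening options) come first and pythran's per-extension extra_compile_args (-std=c++11, -O1, -w, -UNDEBUG, the -Wno-* set) come last and take precedence, exactly as the "later command line args" comment above anticipates. A schematic sketch of that assembly with shortened, illustrative flag lists:

# Shortened stand-ins for the values visible in this log.
compiler_so = ["gcc", "-O2", "-Wall", "-fexceptions", "-fPIC"]          # CFLAGS side
cc_args = ["-DENABLE_PYTHON_MODULE", "-I/usr/include/python3.10", "-c"]
deps = []                                                               # no -MMD/-MF in this run
extra_postargs = ["-std=c++11", "-O1", "-w", "-UNDEBUG"]                # pythran's extra_compile_args
src, obj = "/tmp/kernel.cpp", "/tmp/kernel.o"

# Same ordering as the spawn() call in UnixCCompiler__compile: flags that come
# later override earlier ones, so -O1/-w effectively replace -O2/-Wall here.
cmd = compiler_so + cc_args + [src, "-o", obj] + deps + extra_postargs
print(" ".join(cmd))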
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpashihf43/tmp/tmpsmi4u3we.o', ('/tmp/tmpsmi4u3we.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
args = ('/tmp/tmpashihf43/tmp/tmpsmi4u3we.o', '/tmp/tmpsmi4u3we.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...])
kw = {}

>   m = lambda self, *args, **kw: func(self, *args, **kw)

/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
obj = '/tmp/tmpashihf43/tmp/tmpsmi4u3we.o', src = '/tmp/tmpsmi4u3we.cpp'
ext = '.cpp'
cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]
extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]
pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]

    def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
        """Compile a single source files with a Unix-style compiler."""
        # HP ad-hoc fix, see ticket 1383
        ccomp = self.compiler_so
        if ccomp[0] == 'aCC':
            # remove flags that will trigger ANSI-C mode for aCC
            if '-Ae' in ccomp:
                ccomp.remove('-Ae')
            if '-Aa' in ccomp:
                ccomp.remove('-Aa')
            # add flags for (almost) sane C++ handling
            ccomp += ['-AA']
            self.compiler_so = ccomp
        # ensure OPT environment variable is read
        if 'OPT' in os.environ:
            # XXX who uses this?
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsmi4u3we.cpp -o /tmp/tmpashihf43/tmp/tmpsmi4u3we.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_f0b', cxxfile = '/tmp/tmpsmi4u3we.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpk1oqrz8d', buildtmp = '/tmp/tmpashihf43' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsmi4u3we.cpp -o /tmp/tmpashihf43/tmp/tmpsmi4u3we.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_f0b(self): """ Check f with 2 argument with mean and variance. 
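As in the Rayleigh case, the kernel quoted next checks sample statistics against closed-form moments, here for the F distribution with dfnum and dfden degrees of freedom. For reference, a small helper restating the rmean/rvar formulas used in the kernel:

def f_moments(dfnum, dfden):
    """Mean (valid for dfden > 2) and variance (valid for dfden > 4) of the
    F distribution, matching rmean and rvar in the generated test code."""
    mean = dfden / (dfden - 2)
    var = (2 * dfden ** 2 * (dfnum + dfden - 2)) / (dfnum * (dfden - 2) ** 2 * (dfden - 4))
    return mean, var

# f_moments(50, 50) -> (~1.0417, ~0.0925)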
""" code = """ def numpy_f0b(size): from numpy.random import f from numpy import var, mean dfnum = 50 dfden = 50 rmean = dfden / (dfden - 2) rvar = (2 * dfden**2 *( dfnum + dfden -2))/(dfnum * (dfden -2)**2 * (dfden -4)) a = f(dfnum, dfden, size) return (abs(mean(a) - rmean) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_f0b=[int]) pythran/tests/test_numpy_random.py:1145: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_f0b', cxxfile = '/tmp/tmpsmi4u3we.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpk1oqrz8d', buildtmp = '/tmp/tmpashihf43' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpsmi4u3we.cpp -o /tmp/tmpashihf43/tmp/tmpsmi4u3we.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option 
-Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_f0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpashihf43/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpsmi4u3we.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpsmi4u3we.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpsmi4u3we.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_logistic0a _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9ul9asjj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9ul9asjj.cpp'], output_dir = '/tmp/tmpwsmnkzxy' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
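[Editor's note] The CCompiler_compile docstring quoted above describes the distutils compile API that pythran ends up driving. Below is a minimal sketch of that call, reusing the macro definitions and include directories that appear in this log; the source file and build directory are placeholders, not the real temporary files.

# Minimal sketch of the distutils CCompiler.compile() API described above,
# using the macros and include directories shown in this log.
# The cxxfile/build_temp arguments are placeholders.
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

def compile_one(cxxfile, build_temp):
    compiler = new_compiler()
    customize_compiler(compiler)      # picks up the platform CFLAGS seen above
    return compiler.compile(
        [cxxfile],
        output_dir=build_temp,
        macros=[('ENABLE_PYTHON_MODULE', None),
                ('__PYTHRAN__', '3'),
                ('PYTHRAN_BLAS_OPENBLAS', None)],
        include_dirs=['/usr/include/flexiblas',
                      '/usr/lib64/python3.10/site-packages/numpy/core/include'],
        extra_postargs=['-std=c++11', '-fno-math-errno', '-fvisibility=hidden'],
    )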
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o', ('/tmp/tmp9ul9asjj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o', '/tmp/tmp9ul9asjj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o', src = '/tmp/tmp9ul9asjj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9ul9asjj.cpp -o /tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logistic0a', cxxfile = '/tmp/tmp9ul9asjj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpr2gwh1lq', buildtmp = '/tmp/tmpwsmnkzxy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
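[Editor's note] The setup() docstring above is the machinery that pythran's compile_cxxfile (pythran/toolchain.py:300, quoted earlier in this traceback) drives programmatically with a fake command line; in this log the call is first routed through numpy.distutils' setup() wrapper before reaching distutils.core.setup(). A stripped-down sketch of that pattern follows; the module name and .cpp path are placeholders and this is not pythran's exact implementation.

# Sketch of the "fake CLI" setup() invocation seen in compile_cxxfile above.
from tempfile import mkdtemp
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

def build_native_module(module_name, cxxfile):
    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = PythranExtension(module_name, [cxxfile])
    # distutils reports failure by raising SystemExit; pythran converts that
    # into the CompileError reported further down in this log.
    setup(name=module_name,
          ext_modules=[ext],
          cmdclass={'build_ext': PythranBuildExt},
          script_name='setup.py',
          script_args=['--quiet', 'build_ext',
                       '--build-lib', builddir,
                       '--build-temp', buildtmp])
    return builddir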
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9ul9asjj.cpp -o /tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logistic0a(self): """ Check logistic with 1 argument with mean and variance. 
""" code = """ def numpy_logistic0a(size): from numpy.random import logistic from numpy import var, mean, pi u = 2 rmean = u rvar = (pi**2/3) a = [logistic(u) for x in range(size)] return (abs(mean(a) - rmean ) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_logistic0a=[int]) pythran/tests/test_numpy_random.py:1394: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logistic0a', cxxfile = '/tmp/tmp9ul9asjj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpr2gwh1lq', buildtmp = '/tmp/tmpwsmnkzxy' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9ul9asjj.cpp -o /tmp/tmpwsmnkzxy/tmp/tmp9ul9asjj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logistic0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpwsmnkzxy/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp9ul9asjj.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp9ul9asjj.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
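[Editor's note] The diagnostic above is the actual root cause of this failure, and most likely of the similar CompileErrors elsewhere in this test run: numpy.var pulls in pythonic/numpy/conjugate.hpp, whose SIMD wrapper calls xsimd::conj, and the xsimd headers picked up during this build apparently do not declare it. The following is a hypothetical standalone reproducer, rebuilt from the flattened numpy_logistic0a test code quoted above; the module name is arbitrary and the export line mirrors the test's numpy_logistic0a=[int] spec.

# Hypothetical reproducer for the xsimd::conj error, outside the test suite.
from distutils.errors import CompileError
import pythran

CODE = '''
#pythran export numpy_logistic0a(int)
def numpy_logistic0a(size):
    from numpy.random import logistic
    from numpy import var, mean, pi
    u = 2
    rmean = u
    rvar = (pi**2/3)
    a = [logistic(u) for x in range(size)]
    return (abs(mean(a) - rmean) < 0.1 and abs(var(a) - rvar) < .1)
'''

try:
    pythran.compile_pythrancode('numpy_logistic0a_repro', CODE)
except CompileError as exc:
    print('reproduced the build failure:', exc)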
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp9ul9asjj.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_rayleigh0a _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
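[Editor's note] The pythran/dist.py frames quoted earlier (build_extension at pythran/dist.py:109) show how PythranBuildExtMixIn temporarily rewrites the distutils compiler before delegating to the stock build_extension. A simplified sketch of that backup-and-override step, assuming the same attribute names; this is not pythran's exact code.

# Back up the relevant CCompiler attributes, then point them at the C++
# compiler requested by the extension.  The caller restores from `saved`.
def override_compiler(compiler, cxx):
    keys = ('preprocessor', 'compiler_cxx', 'compiler_so',
            'compiler', 'linker_exe', 'linker_so', 'cc')
    saved = {k: getattr(compiler, k) for k in keys if hasattr(compiler, k)}
    for key in saved:
        value = getattr(compiler, key)
        if isinstance(value, list):
            value[0] = cxx          # argv-style list: replace the executable only
        else:
            setattr(compiler, key, cxx)
    return saved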
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpjb8ngsj6.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpjb8ngsj6.cpp'], output_dir = '/tmp/tmpnwt_o72w' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
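[Editor's note] The CCompiler_compile listing above (repeated here for the second failing test) throttles parallel builds with a module-level semaphore and a thread pool. A self-contained sketch of that pattern follows; a print stands in for self._compile, and jobs is hard-coded where the real code calls get_num_build_jobs().

# Throttled parallel compilation, in the style of numpy.distutils above.
import threading
from multiprocessing.pool import ThreadPool

jobs = 4
_job_semaphore = threading.Semaphore(jobs)

def single_compile(item):
    obj, (src, ext) = item
    with _job_semaphore:               # at most `jobs` compiles in flight
        print('would compile', src, '->', obj)

build_items = [('a.o', ('a.cpp', '.cpp')), ('b.o', ('b.cpp', '.cpp'))]
with ThreadPool(jobs) as pool:
    pool.map(single_compile, build_items)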
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o', ('/tmp/tmpjb8ngsj6.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o', '/tmp/tmpjb8ngsj6.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o', src = '/tmp/tmpjb8ngsj6.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjb8ngsj6.cpp -o /tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_rayleigh0a', cxxfile = '/tmp/tmpjb8ngsj6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpzuu8i165', buildtmp = '/tmp/tmpnwt_o72w' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
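The setup() source quoted in this traceback shows where the gcc failure becomes a SystemExit: run_commands() raises a DistutilsError/CCompilerError and setup() re-raises it as SystemExit("error: ..."), while pythran's compile_cxxfile() (shown in the compile_cxxfile frames of this same traceback) converts that SystemExit back into a CompileError for its callers. A minimal sketch of that conversion pattern; setup_callable and setup_kwargs are hypothetical stand-ins for whatever setup() entry point a caller uses, not pythran's exact code:

from distutils.errors import CompileError

def build_or_raise(setup_callable, **setup_kwargs):
    # Invoke a distutils-style setup() and normalise failures to CompileError,
    # the same re-raise done at pythran/toolchain.py:313 in the frames above.
    try:
        return setup_callable(**setup_kwargs)
    except SystemExit as exc:        # distutils wraps compiler errors in SystemExit
        raise CompileError(str(exc))

Catching SystemExit like this is what keeps a failed compiler invocation from terminating the whole test process: the caller sees an ordinary exception instead of an interpreter exit.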
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjb8ngsj6.cpp -o /tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_rayleigh0a(self): """ Check rayleigh with 1 argument with mean and variance. 
""" code = """ def numpy_rayleigh0a(size): from numpy.random import rayleigh from numpy import var, mean, sqrt, pi s = 2 a = [rayleigh(s) for x in range(size)] rmean = s*sqrt(pi/2) rvar = ((4-pi)/2)*s**2 return (abs(mean(a)-rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_rayleigh0a=[int]) pythran/tests/test_numpy_random.py:1069: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_rayleigh0a', cxxfile = '/tmp/tmpjb8ngsj6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpzuu8i165', buildtmp = '/tmp/tmpnwt_o72w' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjb8ngsj6.cpp -o /tmp/tmpnwt_o72w/tmp/tmpjb8ngsj6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs 
-Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_rayleigh0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpnwt_o72w/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpjb8ngsj6.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpjb8ngsj6.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
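The cc1plus error just above is the actual failure behind every CompileError in this run: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, the xsimd headers available on this builder do not provide that symbol, and any kernel whose C++ lowering includes numpy/var.hpp (which pulls in conjugate.hpp, per the include chain above) therefore fails to compile. A reproduction sketch outside the test suite, assuming a working pythran install; the kernel below is illustrative, and the tests in this log reach conjugate.hpp through numpy.var:

from distutils.errors import CompileError
from pythran import compile_pythrancode

CODE = """
#pythran export var_of(float64 list)
def var_of(xs):
    from numpy import var
    return var(xs)
"""

try:
    # Same path as run_test(): compile_pythrancode -> compile_cxxcode -> compile_cxxfile
    compile_pythrancode("var_repro", CODE)
except CompileError as exc:
    # On this builder the message ends with the gcc command line and the
    # "'conj' is not a member of 'xsimd'" diagnostic captured in this log.
    print("compilation failed:", str(exc)[:200])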
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpjb8ngsj6.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________________ TestNumpyRandom.test_numpy_f2 _________________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
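The remaining failures in this run repeat the same diagnostic; the Python-level logic of test_numpy_rayleigh0a itself is only a moment check, which can be verified directly with NumPy (using the Generator API here rather than the legacy numpy.random functions the test exercises):

import numpy as np

# Identities hard-coded in the test: for a Rayleigh distribution with scale s,
# mean = s*sqrt(pi/2) and variance = (4 - pi)/2 * s**2.
s = 2
rmean = s * np.sqrt(np.pi / 2)      # ~2.5066
rvar = (4 - np.pi) / 2 * s ** 2     # ~1.7168

rng = np.random.default_rng(0)
a = rng.rayleigh(s, size=10 ** 6)
print(abs(a.mean() - rmean) < .05, abs(a.var() - rvar) < .05)   # expect: True True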
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
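These distutils frames also show how pythran gets its own build step executed at all: compile_cxxfile() passes cmdclass={"build_ext": PythranBuildExt} together with fake script_args, so Distribution.run_command() instantiates the pythran subclass instead of the stock build_ext. A self-contained sketch of that override using plain distutils classes, with a trivial subclass standing in for PythranBuildExt:

from distutils.command.build_ext import build_ext
from distutils.dist import Distribution

class LoggingBuildExt(build_ext):
    # Illustrative stand-in for PythranBuildExt: announce itself, then defer
    # to the stock implementation (a no-op here, since no extensions are set).
    def run(self):
        print("running custom build_ext for", self.distribution.get_name())
        super().run()

dist = Distribution({"name": "demo", "cmdclass": {"build_ext": LoggingBuildExt}})
dist.script_name = "setup.py"                  # fake CLI call, as in compile_cxxfile()
dist.script_args = ["--quiet", "build_ext"]
dist.parse_command_line()
dist.run_commands()                            # dispatches to LoggingBuildExt.run()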
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp6c_imgh2.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp6c_imgh2.cpp'], output_dir = '/tmp/tmpdglk6fof' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o', ('/tmp/tmp6c_imgh2.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
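The CCompiler_compile frame above also documents how numpy.distutils parallelises these per-file compiles: a module-level semaphore sized from get_num_build_jobs() caps concurrent compiler invocations even when a wider ThreadPool feeds single_compile(). A stripped-down sketch of that pattern with illustrative names, not numpy's actual module state:

import threading
from multiprocessing.pool import ThreadPool

_job_semaphore = threading.Semaphore(4)   # cap of 4 concurrent "compiles" (numpy sizes this from get_num_build_jobs())

def single_compile(unit):
    # Placeholder for self._compile(obj, src, ext, ...): just simulate the
    # guarded section where the real compiler process would run.
    with _job_semaphore:
        return "built " + unit

units = ["obj%d.o" % i for i in range(10)]
with ThreadPool(8) as pool:               # the pool may be wider than the job cap
    print(pool.map(single_compile, units))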
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o', '/tmp/tmp6c_imgh2.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o', src = '/tmp/tmp6c_imgh2.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6c_imgh2.cpp -o /tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_f2', cxxfile = '/tmp/tmp6c_imgh2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptlkddnkx', buildtmp = '/tmp/tmpdglk6fof' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_f2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6c_imgh2.cpp -o /tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_f2(self): """Check f with shape argument with mean and variance.""" code = """ def numpy_f2(size): from numpy.random import f from numpy import mean, var dfnum = 50 dfden = 50 rmean = dfden / (dfden - 2) rvar = (2 * dfden**2 *( dfnum + dfden -2))/(dfnum * (dfden -2)**2 * (dfden -4)) a = f(dfnum, dfden, size=(size, size)) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 3, numpy_f2=[int]) pythran/tests/test_numpy_random.py:1160: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_f2', cxxfile = '/tmp/tmp6c_imgh2.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptlkddnkx', buildtmp = 
'/tmp/tmpdglk6fof' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6c_imgh2.cpp -o /tmp/tmpdglk6fof/tmp/tmp6c_imgh2.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_f2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong 
-m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpdglk6fof/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp6c_imgh2.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp6c_imgh2.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp6c_imgh2.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_logistic0b _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
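test_numpy_f2 fails at the same C++ stage; its Python-level assertion only encodes the F-distribution moments mean = dfden/(dfden - 2) and var = 2*dfden**2*(dfnum + dfden - 2) / (dfnum*(dfden - 2)**2*(dfden - 4)), which can be checked directly with NumPy (Generator API here instead of the legacy numpy.random.f the test uses):

import numpy as np

dfnum = dfden = 50                  # same parameters as the test code above
rmean = dfden / (dfden - 2)
rvar = (2 * dfden ** 2 * (dfnum + dfden - 2)) / (dfnum * (dfden - 2) ** 2 * (dfden - 4))

rng = np.random.default_rng(0)
a = rng.f(dfnum, dfden, size=(1000, 1000))   # the test's 10**3 x 10**3 sample
print(abs(a.mean() - rmean) < .1, abs(a.var() - rvar) < .1)   # expect: True True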
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
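The comment above ("any sensible compiler will give precedence to later command line args") is the key to reading the enormous gcc command quoted in the CompileError further down: the distro CFLAGS blocks come first, and pythran's own options plus the test harness's extra_compile_args (visible below in compile_cxxfile's kwargs as ['-O1', '-Wall', '-w', '-UNDEBUG', ...]) are appended last, so for toggle-style options the trailing values win: -O1 overrides the earlier -O2, -fno-wrapv overrides -fwrapv, -UNDEBUG undoes -DNDEBUG, and -w silences the warnings -Wall would have enabled. A tiny sketch of that last-occurrence-wins reading (effective_flags is an ad-hoc helper for this note, not a real tool):

    def effective_flags(args):
        # Ad-hoc helper: for a few mutually exclusive gcc options, the
        # last occurrence on the command line is the one that applies.
        effective = {}
        for arg in args:
            if arg.startswith("-O"):
                effective["optimization"] = arg
            elif arg in ("-fwrapv", "-fno-wrapv"):
                effective["wrapv"] = arg
            elif arg in ("-DNDEBUG", "-UNDEBUG"):
                effective["NDEBUG"] = arg
        return effective

    # Ordering mirrors the failing command: distro CFLAGS first,
    # pythran's extra options last.
    cmd = ["-O2", "-fwrapv", "-DNDEBUG", "-Wall",
           "-O1", "-w", "-UNDEBUG", "-fno-wrapv"]
    print(effective_flags(cmd))
    # -> {'optimization': '-O1', 'wrapv': '-fno-wrapv', 'NDEBUG': '-UNDEBUG'}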
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp5mj7kflk.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp5mj7kflk.cpp'], output_dir = '/tmp/tmpyqrnwcdc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o', ('/tmp/tmp5mj7kflk.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
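CCompiler_compile above is numpy.distutils' parallel front end: a module-level semaphore sized from get_num_build_jobs() caps how many real compiler jobs run at once, _processing_files plus _global_lock keeps two extensions from building the same object file simultaneously, and a ThreadPool only kicks in when there is more than one object and more than one job. Here each test compiles a single temporary .cpp, so the serial branch runs and the failure surfaces directly inside single_compile. A stripped-down sketch of that bounded-parallelism pattern (the "compiler" just sleeps, and unlike the real code this version skips an in-progress object instead of waiting for it):

    import threading
    import time
    from multiprocessing.pool import ThreadPool

    MAX_JOBS = 4                      # stand-in for get_num_build_jobs()
    _job_semaphore = threading.Semaphore(MAX_JOBS)
    _global_lock = threading.Lock()
    _in_progress = set()              # objects currently being "compiled"

    def compile_one(obj):
        # Simplified: skip objects already being processed by another
        # thread (numpy.distutils waits for them instead).
        with _global_lock:
            if obj in _in_progress:
                return f"{obj}: skipped (in progress)"
            _in_progress.add(obj)
        try:
            with _job_semaphore:      # at most MAX_JOBS real jobs at once
                time.sleep(0.05)      # pretend to run the compiler
                return f"{obj}: built"
        finally:
            with _global_lock:
                _in_progress.discard(obj)

    objects = [f"mod{i}.o" for i in range(10)]
    with ThreadPool(MAX_JOBS) as pool:
        print(pool.map(compile_one, objects))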
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o', '/tmp/tmp5mj7kflk.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o', src = '/tmp/tmp5mj7kflk.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5mj7kflk.cpp -o /tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logistic0b', cxxfile = '/tmp/tmp5mj7kflk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpc79zls8w', buildtmp = '/tmp/tmpyqrnwcdc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5mj7kflk.cpp -o /tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logistic0b(self): """ Check logistic with 2 argument with mean and variance. """ code = """ def numpy_logistic0b(size): from numpy.random import logistic from numpy import var, mean, pi u = 2. 
s = 2 rmean = u rvar = ((s**2*pi**2)/3) a = logistic(u, s, size) return (abs(mean(a) - rmean) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_logistic0b=[int]) pythran/tests/test_numpy_random.py:1409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logistic0b', cxxfile = '/tmp/tmp5mj7kflk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpc79zls8w', buildtmp = '/tmp/tmpyqrnwcdc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp5mj7kflk.cpp -o /tmp/tmpyqrnwcdc/tmp/tmp5mj7kflk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
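compile_cxxfile, shown in the frames above, is why each of these failures is ultimately reported as distutils.errors.CompileError: pythran builds the generated .cpp by calling setup() in-process with a fabricated script_name/script_args pair ('setup.py', ['--quiet', 'build_ext', '--build-lib', ..., '--build-temp', ...]), distutils converts the gcc failure into SystemExit("error: ..."), and the toolchain re-raises that as CompileError. A reduced sketch of the same in-process pattern using a plain distutils Extension and a deliberately broken C source (build_inprocess and the file contents are invented for this note; this is not pythran's code):

    import os
    import tempfile
    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    def build_inprocess(name, c_source):
        # Illustrative helper, not pythran's API: build one extension by
        # calling setup() programmatically and map distutils' SystemExit
        # back to CompileError, as pythran's compile_cxxfile does.
        workdir = tempfile.mkdtemp()
        src = os.path.join(workdir, name + ".c")
        with open(src, "w") as f:
            f.write(c_source)
        ext = Extension(name, sources=[src])
        try:
            setup(name=name,
                  ext_modules=[ext],
                  # fake CLI call driving only build_ext into temp dirs
                  script_name="setup.py",
                  script_args=["--quiet", "build_ext",
                               "--build-lib", workdir,
                               "--build-temp", workdir])
        except SystemExit as exc:
            raise CompileError(str(exc)) from exc
        return workdir

    # A deliberately broken translation unit exercises the error path.
    try:
        build_inprocess("broken", "this is not C\n")
    except CompileError as err:
        print("compile failed as expected:", err)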
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logistic0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpyqrnwcdc/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp5mj7kflk.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp5mj7kflk.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp5mj7kflk.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_rayleigh0b _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpczocftq9.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpczocftq9.cpp'], output_dir = '/tmp/tmp0v_7fz09' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp0v_7fz09/tmp/tmpczocftq9.o', ('/tmp/tmpczocftq9.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp0v_7fz09/tmp/tmpczocftq9.o', '/tmp/tmpczocftq9.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp0v_7fz09/tmp/tmpczocftq9.o', src = '/tmp/tmpczocftq9.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpczocftq9.cpp -o /tmp/tmp0v_7fz09/tmp/tmpczocftq9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_rayleigh0b', cxxfile = '/tmp/tmpczocftq9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4kqa7rwy', buildtmp = '/tmp/tmp0v_7fz09' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
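For context on how this setup() call was produced in the first place: pythran's compile_cxxfile (pythran/toolchain.py, quoted further up) never writes a setup.py to disk. It calls setup() directly with script_name and script_args so that distutils parses a fake command line containing a single build_ext command. A minimal sketch of the same pattern follows; the module name, source file and build directories are hypothetical, and this is a sketch rather than pythran's exact code.

from numpy.distutils.core import setup          # pythran routes through numpy.distutils, as the traceback shows
from pythran.dist import PythranExtension, PythranBuildExt

# 'example_module' / 'example.cpp' are placeholders for a pythran-generated C++ file
ext = PythranExtension('example_module', ['example.cpp'])
setup(name='example_module',
      ext_modules=[ext],
      cmdclass={'build_ext': PythranBuildExt},
      script_name='setup.py',                   # fake CLI call, as in compile_cxxfile
      script_args=['--quiet', 'build_ext',
                   '--build-lib', '/tmp/build-lib',
                   '--build-temp', '/tmp/build-tmp'])
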
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpczocftq9.cpp -o /tmp/tmp0v_7fz09/tmp/tmpczocftq9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_rayleigh0b(self): """ Check rayleigh with 2 argument with mean and variance. 
""" code = """ def numpy_rayleigh0b(size): from numpy.random import rayleigh from numpy import var, mean, sqrt, pi s = 2 a = rayleigh(s, size) rmean = s*sqrt(pi/2) rvar = ((4-pi)/2)*s**2 return (abs(mean(a)-rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_rayleigh0b=[int]) pythran/tests/test_numpy_random.py:1083: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_rayleigh0b', cxxfile = '/tmp/tmpczocftq9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp4kqa7rwy', buildtmp = '/tmp/tmp0v_7fz09' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpczocftq9.cpp -o /tmp/tmp0v_7fz09/tmp/tmpczocftq9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_rayleigh0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp0v_7fz09/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpczocftq9.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpczocftq9.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
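This is the actual compiler error behind every CompileError in this run: pythonic/include/numpy/conjugate.hpp line 25 calls xsimd::conj(), and the xsimd headers used for this build do not declare it, so any kernel whose include chain reaches pythonic/numpy/var.hpp fails the same way. The cc1plus warning about '-Wno-absolute-value' and the '-Wno-unknown-warning-option' notes in the captured stderr are harmless; the unresolved xsimd::conj call is the only hard error. A Python-level reproduction outside the test harness might look like the sketch below; the module name, export spec and source string are illustrative, and it assumes pythran plus the same xsimd headers as this build.

from pythran import compile_pythrancode

code = '''
#pythran export use_var(float64[])
import numpy as np

def use_var(a):
    return np.var(a)    # numpy.var pulls in pythonic/numpy/var.hpp -> numpy/conjugate.hpp
'''

# On a toolchain whose xsimd lacks conj, building the generated C++ raises
# distutils.errors.CompileError with the same gcc command line as above.
compile_pythrancode('xsimd_conj_repro', code)
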
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpczocftq9.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_gamma0a ______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
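build_ext.run(), which continues just below, creates a CCompiler with new_compiler() and then customize_compiler() fills it in from sysconfig (CC, CXX, CFLAGS, LDSHARED), which is essentially where the long hardened gcc command line in the error message comes from. The same two calls can be used standalone to inspect what distutils will run; a sketch, with a hypothetical source file and output directory:

from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

cc = new_compiler(verbose=1)
customize_compiler(cc)          # pulls CC, CXX, CFLAGS, LDSHARED, ... from sysconfig
objects = cc.compile(['example.cpp'],                 # placeholder C++ source
                     output_dir='/tmp/build-tmp',     # placeholder build directory
                     include_dirs=['/usr/include/python3.10'],
                     extra_postargs=['-std=c++11'])
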
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpibp4zo0i.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpibp4zo0i.cpp'], output_dir = '/tmp/tmpa_xpoqgc' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
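The macros, include_dirs and extra_postargs values printed in this frame all come from the PythranExtension object built in compile_cxxfile (its extra_compile_args kwargs are visible earlier in the traceback). At the distutils level the same options can be written out as an Extension; the macro, include and flag values below are taken from this log, while the module and source names are placeholders.

from distutils.core import Extension

ext = Extension(
    'example_module',
    sources=['example.cpp'],
    define_macros=[('ENABLE_PYTHON_MODULE', None),    # -DENABLE_PYTHON_MODULE
                   ('__PYTHRAN__', '3'),              # -D__PYTHRAN__=3
                   ('PYTHRAN_BLAS_OPENBLAS', None)],  # -DPYTHRAN_BLAS_OPENBLAS
    include_dirs=['/usr/include/flexiblas',
                  '/usr/lib64/python3.10/site-packages/numpy/core/include'],
    extra_compile_args=['-std=c++11', '-O1', '-w', '-UNDEBUG'],
)

Because extra_compile_args are appended after the distribution-wide CFLAGS, the -O1 and -w injected by the test harness take precedence over the earlier -O2 -Wall, exactly as the comment above about later command line args describes.
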
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o', ('/tmp/tmpibp4zo0i.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o', '/tmp/tmpibp4zo0i.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o', src = '/tmp/tmpibp4zo0i.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpibp4zo0i.cpp -o /tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gamma0a', cxxfile = '/tmp/tmpibp4zo0i.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdv21ke57', buildtmp = '/tmp/tmpa_xpoqgc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
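When one of those commands fails, distutils raises SystemExit("error: ..."), as shown just below, and pythran's compile_cxxfile re-raises it as a distutils CompileError so callers see a compiler failure instead of an interpreter exit. That is why each failing test reports both a SystemExit and a CompileError for the same gcc command. The conversion, sketched here with a placeholder build callable rather than pythran's real setup() call:

from distutils.errors import CompileError

def run_build(build):
    # 'build' stands in for the setup(...) invocation; it is a placeholder, not pythran API
    try:
        build()
    except SystemExit as e:
        raise CompileError(str(e))
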
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpibp4zo0i.cpp -o /tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gamma0a(self): """ Check gamma with 1 argument with mean and variance. """ code = """ def numpy_gamma0a(size): from numpy.random import gamma from numpy import var, mean shape = 1 a = [gamma(3.) 
for x in range(size)] return (abs(mean(a)- shape) < 0.1 and abs(var(a) - shape) < .1) """ > self.run_test(code, 10 ** 6, numpy_gamma0a=[int]) pythran/tests/test_numpy_random.py:762: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gamma0a', cxxfile = '/tmp/tmpibp4zo0i.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpdv21ke57', buildtmp = '/tmp/tmpa_xpoqgc' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpibp4zo0i.cpp -o /tmp/tmpa_xpoqgc/tmp/tmpibp4zo0i.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call 
----------------------------- running build_ext new_compiler returns building 'test_numpy_gamma0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpa_xpoqgc/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpibp4zo0i.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpibp4zo0i.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpibp4zo0i.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_logistic1 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpdxawr15q.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpdxawr15q.cpp'], output_dir = '/tmp/tmpezafvx71' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpezafvx71/tmp/tmpdxawr15q.o', ('/tmp/tmpdxawr15q.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpezafvx71/tmp/tmpdxawr15q.o', '/tmp/tmpdxawr15q.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpezafvx71/tmp/tmpdxawr15q.o', src = '/tmp/tmpdxawr15q.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdxawr15q.cpp -o /tmp/tmpezafvx71/tmp/tmpdxawr15q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logistic1', cxxfile = '/tmp/tmpdxawr15q.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpmihnjk08', buildtmp = '/tmp/tmpezafvx71' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdxawr15q.cpp -o /tmp/tmpezafvx71/tmp/tmpdxawr15q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logistic1(self): """ Check logistic with size argument with mean and variance.""" code = """ def numpy_logistic1(size): from numpy.random import logistic from numpy import var, mean from numpy import var, mean, pi u = 0. 
s = 1 rmean = u rvar = ((s**2*pi**2)/3) a = logistic(size=size) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_logistic1=[int]) pythran/tests/test_numpy_random.py:1425: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logistic1', cxxfile = '/tmp/tmpdxawr15q.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpmihnjk08', buildtmp = '/tmp/tmpezafvx71' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdxawr15q.cpp -o /tmp/tmpezafvx71/tmp/tmpdxawr15q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
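Editor's note: the pythran/toolchain.py frames quoted in this failure show the wrapper pattern behind the doubled exceptions ("During handling of the above exception, another exception occurred"): distutils' setup() reports the failed gcc invocation by raising SystemExit, and compile_cxxfile converts that back into a CompileError for the test harness. The following is a minimal, simplified sketch of that pattern, not the project's exact code; compile_cxxfile_sketch and the plain Extension (instead of PythranExtension) are illustrative choices.

    # Sketch of the compile_cxxfile pattern quoted above (simplified, illustrative).
    from tempfile import mkdtemp
    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    def compile_cxxfile_sketch(module_name, cxxfile, **kwargs):
        """C++ file -> native module; raises CompileError on failure."""
        builddir = mkdtemp()
        buildtmp = mkdtemp()
        # the real code builds a PythranExtension here
        extension = Extension(module_name, [cxxfile], **kwargs)
        try:
            setup(name=module_name,
                  ext_modules=[extension],
                  # fake CLI call, as in the quoted code
                  script_name='setup.py',
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            # gcc's exit status 1 surfaces here as SystemExit("error: Command ...")
            raise CompileError(str(e))

This is why the same gcc command line appears twice per failure: once in the original CompileError raised by the compiler wrapper, and once in the CompileError re-raised from the SystemExit message.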
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logistic1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpezafvx71/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpdxawr15q.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpdxawr15q.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpdxawr15q.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_rayleigh1 _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmplwxkcjei.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmplwxkcjei.cpp'], output_dir = '/tmp/tmpnl2umcls' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpnl2umcls/tmp/tmplwxkcjei.o', ('/tmp/tmplwxkcjei.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpnl2umcls/tmp/tmplwxkcjei.o', '/tmp/tmplwxkcjei.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpnl2umcls/tmp/tmplwxkcjei.o', src = '/tmp/tmplwxkcjei.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplwxkcjei.cpp -o /tmp/tmpnl2umcls/tmp/tmplwxkcjei.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_rayleigh1', cxxfile = '/tmp/tmplwxkcjei.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmplsyj3lzu', buildtmp = '/tmp/tmpnl2umcls' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
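The compile_cxxfile() frame at the top of this traceback drives the setup() machinery documented here programmatically, with a fabricated command line instead of a real setup.py invocation. A minimal sketch of that pattern, assuming pythran and numpy.distutils are importable; the helper name and its arguments are placeholders, not the test suite's own values:

from tempfile import mkdtemp
from numpy.distutils.core import setup            # the setup() these frames resolve to
from pythran.dist import PythranExtension, PythranBuildExt

def build_native_module(module_name, cxxfile, **kwargs):
    # mirror the fake CLI call shown in pythran/toolchain.py above
    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = PythranExtension(module_name, [cxxfile], **kwargs)
    setup(name=module_name,
          ext_modules=[ext],
          cmdclass={"build_ext": PythranBuildExt},
          script_name='setup.py',
          script_args=['--quiet', 'build_ext',
                       '--build-lib', builddir,
                       '--build-temp', buildtmp])
    return builddir

Because the build funnels through distutils in-process, every test module inherits the interpreter's distribution-wide hardened flags, which is consistent with the gcc command lines in this log repeating the full Fedora flag set ahead of pythran's own -std=c++11 options.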
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplwxkcjei.cpp -o /tmp/tmpnl2umcls/tmp/tmplwxkcjei.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_rayleigh1(self): """ Check rayleigh with size argument with mean and variance.""" code = """ def numpy_rayleigh1(size): from numpy.random import rayleigh from numpy import var, mean, sqrt, pi a = rayleigh(size=size) s = 2 rmean = s*sqrt(pi/2) rvar = ((4-pi)/2)*s**2 return (abs(mean(a)-rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_rayleigh1=[int]) pythran/tests/test_numpy_random.py:1097: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_rayleigh1', cxxfile = '/tmp/tmplwxkcjei.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmplsyj3lzu', buildtmp = '/tmp/tmpnl2umcls' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplwxkcjei.cpp -o /tmp/tmpnl2umcls/tmp/tmplwxkcjei.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_rayleigh1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpnl2umcls/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmplwxkcjei.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmplwxkcjei.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmplwxkcjei.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_logistic2 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
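The pythran/dist.py build_extension() override in these frames backs up the compiler attributes and, when the extension carries a cxx (or, on Windows, cc) attribute, substitutes that compiler before delegating to distutils. A hypothetical use of the hook, included only to show what the getattr(ext, 'cxx', None) branch reacts to:

from pythran.dist import PythranExtension

ext = PythranExtension('example_module', ['example_module.cpp'])
ext.cxx = 'clang++'   # hypothetical per-extension override picked up by PythranBuildExt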
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpju2zjcgk.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpju2zjcgk.cpp'], output_dir = '/tmp/tmps2k533g9' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
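The CCompiler_compile body that follows caps concurrency with a semaphore sized from get_num_build_jobs() and fans the per-source compiles out over a thread pool. A stripped-down sketch of that job-limiting pattern, with a print standing in for the real self._compile() call:

import threading
from multiprocessing.pool import ThreadPool

jobs = 4                                  # numpy derives this from get_num_build_jobs()
_job_semaphore = threading.Semaphore(jobs)

def single_compile(item):
    obj, (src, ext) = item
    with _job_semaphore:                  # take a slot before invoking the compiler
        print("compiling %s -> %s" % (src, obj))

build_items = [('a.o', ('a.cpp', '.cpp')), ('b.o', ('b.cpp', '.cpp'))]
with ThreadPool(jobs) as pool:
    pool.map(single_compile, build_items)

The real implementation additionally tracks in-flight object files in a lock-protected set, so a source shared by several extensions is never compiled twice at the same time.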
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmps2k533g9/tmp/tmpju2zjcgk.o', ('/tmp/tmpju2zjcgk.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmps2k533g9/tmp/tmpju2zjcgk.o', '/tmp/tmpju2zjcgk.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmps2k533g9/tmp/tmpju2zjcgk.o', src = '/tmp/tmpju2zjcgk.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpju2zjcgk.cpp -o /tmp/tmps2k533g9/tmp/tmpju2zjcgk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logistic2', cxxfile = '/tmp/tmpju2zjcgk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpquk5rscg', buildtmp = '/tmp/tmps2k533g9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logistic2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpju2zjcgk.cpp -o /tmp/tmps2k533g9/tmp/tmpju2zjcgk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logistic2(self): """Check logistic with shape argument with mean and variance.""" code = """ def numpy_logistic2(size): from numpy.random import logistic from numpy import mean, var from numpy import var, mean, pi u = 0 s = 1 rmean = u rvar = ((s**2*pi**2)/3) a = logistic(size=(size, size)) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 3, numpy_logistic2=[int]) pythran/tests/test_numpy_random.py:1441: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logistic2', cxxfile = '/tmp/tmpju2zjcgk.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpquk5rscg', buildtmp = '/tmp/tmps2k533g9' extension = 
def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpju2zjcgk.cpp -o /tmp/tmps2k533g9/tmp/tmpju2zjcgk.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logistic2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmps2k533g9/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpju2zjcgk.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpju2zjcgk.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpju2zjcgk.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_gamma0b ______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp_r07tdd9.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp_r07tdd9.cpp'], output_dir = '/tmp/tmp4bh06xjp' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o', ('/tmp/tmp_r07tdd9.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o', '/tmp/tmp_r07tdd9.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o', src = '/tmp/tmp_r07tdd9.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp_r07tdd9.cpp -o /tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gamma0b', cxxfile = '/tmp/tmp_r07tdd9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpx_q3gkd_', buildtmp = '/tmp/tmp4bh06xjp' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp_r07tdd9.cpp -o /tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gamma0b(self): """ Check gamma with 2 argument with mean and variance. 
""" code = """ def numpy_gamma0b(size): from numpy.random import gamma from numpy import var, mean, sqrt shape, scale = 1,2 a = gamma(shape, scale, size) return (abs(mean(a) - shape*scale) < 0.05 and abs(var(a) - shape*scale**2) < .05) """ > self.run_test(code, 10 ** 6, numpy_gamma0b=[int]) pythran/tests/test_numpy_random.py:774: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gamma0b', cxxfile = '/tmp/tmp_r07tdd9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpx_q3gkd_', buildtmp = '/tmp/tmp4bh06xjp' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp_r07tdd9.cpp -o /tmp/tmp4bh06xjp/tmp/tmp_r07tdd9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gamma0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp4bh06xjp/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp_r07tdd9.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp_r07tdd9.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp_r07tdd9.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_rayleigh2 _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpmi674uzw.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpmi674uzw.cpp'], output_dir = '/tmp/tmpnsqn5f3m' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o', ('/tmp/tmpmi674uzw.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o', '/tmp/tmpmi674uzw.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o', src = '/tmp/tmpmi674uzw.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmi674uzw.cpp -o /tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_rayleigh2', cxxfile = '/tmp/tmpmi674uzw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpho7m1cbf', buildtmp = '/tmp/tmpnsqn5f3m' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_rayleigh2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
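The frames above run through the fake 'setup.py build_ext' command line that pythran's compile_cxxfile assembles by hand instead of shelling out to a real setup script. A minimal stand-alone sketch of that pattern follows; the module name and C++ source file are hypothetical, and plain distutils is assumed to be importable (it still is on Python 3.10, though deprecated).

from distutils.core import setup
from distutils.extension import Extension
from tempfile import mkdtemp

# Hypothetical extension: build demo.cpp into a module called "demo" by faking
# the distutils command line in-process, as compile_cxxfile does above.
ext = Extension('demo', sources=['demo.cpp'])
setup(name='demo',
      ext_modules=[ext],
      script_name='setup.py',
      script_args=['--quiet', 'build_ext',
                   '--build-lib', mkdtemp(),
                   '--build-temp', mkdtemp()])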
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmi674uzw.cpp -o /tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_rayleigh2(self): """Check rayleigh with shape argument with mean and variance.""" code = """ def numpy_rayleigh2(size): from numpy.random import rayleigh from numpy import mean, var, sqrt, pi a = rayleigh(size=(size, size)) s = 2 rmean = s*sqrt(pi/2) rvar = ((4-pi)/2)*s**2 return (abs(mean(a)-rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_rayleigh2=[int]) pythran/tests/test_numpy_random.py:1111: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_rayleigh2', cxxfile = '/tmp/tmpmi674uzw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpho7m1cbf', buildtmp = '/tmp/tmpnsqn5f3m' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmi674uzw.cpp -o /tmp/tmpnsqn5f3m/tmp/tmpmi674uzw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_rayleigh2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpnsqn5f3m/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpmi674uzw.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpmi674uzw.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 | return xsimd::conj(v);
      | ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpmi674uzw.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 | conj(_Tp __x)
      | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
______________________ TestNumpyRandom.test_numpy_gamma2 _______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma2', ...}
klass = dist = ok = True
def setup (**attrs):
"""The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
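The comment just above introduces pythran's flag-stripping loop (its body appears in the next frame): every flag listed in pythran's ignoreflags setting is removed, repeatedly, from the compiler and linker command lines. A stand-alone illustration of that removal pattern, with an illustrative flag list:

# Remove every occurrence of each ignored flag; list.remove() raises
# ValueError once none are left, which ends the inner loop.
compiler_so = ['gcc', '-O2', '-Wstrict-prototypes', '-fPIC', '-Wstrict-prototypes']
ignoreflags = ['-Wstrict-prototypes']

for flag in ignoreflags:
    try:
        while True:
            compiler_so.remove(flag)
    except ValueError:
        pass

print(compiler_so)   # ['gcc', '-O2', '-fPIC']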
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp50s6fnw_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp50s6fnw_.cpp'], output_dir = '/tmp/tmpb39wt9pr' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o', ('/tmp/tmp50s6fnw_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
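single_compile() above is the unit of work for numpy.distutils' parallel builds: a thread pool hands out source files while a semaphore caps the number of concurrent compiler processes (the job count comes from the build_ext --parallel option or the NPY_NUM_BUILD_JOBS environment variable). Here only one source is built, so the serial branch runs. A simplified, self-contained sketch of that pattern, not numpy's actual code:

import threading
from multiprocessing.pool import ThreadPool

jobs = 4
job_semaphore = threading.Semaphore(jobs)

def single_compile(src):
    # At most `jobs` compiles run at once; a real implementation would spawn
    # the compiler here instead of printing.
    with job_semaphore:
        print('compiling', src)

sources = ['a.cpp', 'b.cpp', 'c.cpp']   # hypothetical inputs
pool = ThreadPool(jobs)
pool.map(single_compile, sources)
pool.close()
pool.join()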
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o', '/tmp/tmp50s6fnw_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o', src = '/tmp/tmp50s6fnw_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50s6fnw_.cpp -o /tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gamma2', cxxfile = '/tmp/tmp50s6fnw_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpj5lrmjzg', buildtmp = '/tmp/tmpb39wt9pr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gamma2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
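As the next frame shows, distutils turns a CCompilerError into SystemExit("error: ..."); pythran's compile_cxxfile, quoted earlier, catches that SystemExit and re-raises it as CompileError so the caller gets an ordinary exception instead of the interpreter exiting. A minimal sketch of that wrapping, with a hypothetical run_setup callable:

from distutils.errors import CompileError

def run_build(run_setup):
    # run_setup is any callable that drives distutils and may raise SystemExit
    # on a failed build; surface that as a catchable CompileError instead.
    try:
        run_setup()
    except SystemExit as exc:
        raise CompileError(str(exc))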
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50s6fnw_.cpp -o /tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gamma2(self): """Check gamma with shape argument with mean and variance.""" code = """ def numpy_gamma2(size): from numpy.random import gamma from numpy import mean, var shape = 2 a = gamma(shape = shape, size=(size, size)) return (abs(mean(a) - shape) < .05 and abs(var(a) - shape) < .05) """ > self.run_test(code, 10 ** 3, numpy_gamma2=[int]) pythran/tests/test_numpy_random.py:786: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gamma2', cxxfile = '/tmp/tmp50s6fnw_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpj5lrmjzg', buildtmp = '/tmp/tmpb39wt9pr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): 
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp50s6fnw_.cpp -o /tmp/tmpb39wt9pr/tmp/tmp50s6fnw_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gamma2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE 
-fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpb39wt9pr/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp50s6fnw_.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp50s6fnw_.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp50s6fnw_.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_lognormal0 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp6h7yi30_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp6h7yi30_.cpp'], output_dir = '/tmp/tmp0zovb3os' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
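# --- editorial sketch, not part of the captured build log ---
# The CCompiler_compile body that follows throttles parallel compilation with
# a module-level semaphore: a ThreadPool may hand out many work items, but
# only `jobs` of them can sit inside self._compile() at once. Reduced to the
# bare pattern (the names below are illustrative, not numpy.distutils
# internals):
import threading
from multiprocessing.pool import ThreadPool

jobs = 2
job_semaphore = threading.Semaphore(jobs)

def single_compile(item):
    with job_semaphore:           # at most `jobs` concurrent "compiles"
        print("compiling", item)  # stands in for self._compile(...)

with ThreadPool(jobs) as pool:
    pool.map(single_compile, range(8))
# --- end editorial sketch ---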
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o', ('/tmp/tmp6h7yi30_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o', '/tmp/tmp6h7yi30_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o', src = '/tmp/tmp6h7yi30_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6h7yi30_.cpp -o /tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_lognormal0', cxxfile = '/tmp/tmp6h7yi30_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgivwfubq', buildtmp = '/tmp/tmp0zovb3os' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
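# --- editorial sketch, not part of the captured build log ---
# Just below, distutils turns the CompileError into SystemExit("error: ...");
# pythran's compile_cxxfile (shown later in this traceback, re-raise at
# pythran/toolchain.py:313) catches that SystemExit and converts it back into
# a distutils CompileError, so callers get an exception rather than an
# interpreter exit. The pattern, reduced to a stand-alone toy:
from distutils.errors import CompileError

def fake_setup():
    # stands in for the setup(...) call made by compile_cxxfile
    raise SystemExit("error: compilation failed")

try:
    fake_setup()
except SystemExit as e:
    raise CompileError(str(e))  # what the test harness ultimately reports
# --- end editorial sketch ---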
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6h7yi30_.cpp -o /tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_lognormal0(self): """ Check lognormal without argument with mean and variance. 
""" code = """ def numpy_lognormal0(size): from numpy.random import lognormal from numpy import var, mean, e a = [lognormal() for x in range(size)] m = 0 s = 1/2 rmean = e**(m+(s**2/2)) rvar = (e**(s**2) - 1)*e**(2*m+s**2) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_lognormal0=[int]) pythran/tests/test_numpy_random.py:845: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_lognormal0', cxxfile = '/tmp/tmp6h7yi30_.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpgivwfubq', buildtmp = '/tmp/tmp0zovb3os' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6h7yi30_.cpp -o /tmp/tmp0zovb3os/tmp/tmp6h7yi30_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option 
-Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_lognormal0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp0zovb3os/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp6h7yi30_.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp6h7yi30_.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 | return xsimd::conj(v);
      |                 ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp6h7yi30_.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 | conj(_Tp __x)
      | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
____________________ TestNumpyRandom.test_numpy_geometric0a ____________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric0a', ...}
klass =
dist =
ok = True

def setup (**attrs):
"""The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line.
The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance.
The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'.
Any command-line options between the current and the next command are used to set attributes of the current command object.
When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object.
"""
global _setup_stop_after, _setup_distribution

# Determine the distribution class -- either caller-supplied or
# our Distribution (see below).
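# --- editorial sketch, not part of the captured build log ---
# The failures captured above bottom out in the same diagnostic: pythonic's
# include/numpy/conjugate.hpp:25 calls xsimd::conj(), which the xsimd headers
# used in this build apparently do not provide, hence the compiler's
# suggestion of std::conj. The include chain shows the header is pulled in
# through pythonic/numpy/var.hpp, so any kernel that uses numpy.var should
# reproduce the error without the full test suite. A minimal, hypothetical
# reproducer (the function and module names below are made up):
from pythran.toolchain import compile_pythrancode  # same entry point run_test uses

kernel = """
#pythran export repro(float64[])
import numpy as np
def repro(a):
    return np.var(a)
"""
compile_pythrancode("repro_conj", kernel)  # expected to raise CompileError here
# --- end editorial sketch ---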
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
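# (editorial note) From here on, the traceback for test_numpy_geometric0a
# repeats the frames already shown for test_numpy_lognormal0 above; only the
# module name and the /tmp scratch paths differ, which points at a single
# root cause rather than per-test problems.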
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpaadz2ihm.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpaadz2ihm.cpp'], output_dir = '/tmp/tmplimm81sz' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmplimm81sz/tmp/tmpaadz2ihm.o', ('/tmp/tmpaadz2ihm.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmplimm81sz/tmp/tmpaadz2ihm.o', '/tmp/tmpaadz2ihm.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmplimm81sz/tmp/tmpaadz2ihm.o', src = '/tmp/tmpaadz2ihm.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpaadz2ihm.cpp -o /tmp/tmplimm81sz/tmp/tmpaadz2ihm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_geometric0a', cxxfile = '/tmp/tmpaadz2ihm.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpg9e1nbin', buildtmp = '/tmp/tmplimm81sz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
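The frame above shows pythran's compile_cxxfile driving distutils through a "fake CLI call" rather than a real setup.py run: script_name and script_args are passed straight to setup() so that only build_ext executes, with --build-lib and --build-temp pointing at throwaway directories. A minimal standalone sketch of that pattern, assuming plain distutils and placeholder names instead of PythranExtension/PythranBuildExt:

# Sketch of the "fake CLI call" pattern used by compile_cxxfile above.
# Assumptions: module/file names are placeholders and plain distutils
# Extension is used in place of pythran's extension classes.
from tempfile import mkdtemp
from distutils.core import setup, Extension

def build_extension_inplace(module_name, cxxfile, extra_compile_args=()):
    builddir, buildtmp = mkdtemp(), mkdtemp()
    setup(name=module_name,
          ext_modules=[Extension(module_name, [cxxfile],
                                 extra_compile_args=list(extra_compile_args))],
          script_name='setup.py',                  # fake CLI call
          script_args=['--quiet', 'build_ext',
                       '--build-lib', builddir,
                       '--build-temp', buildtmp])
    return builddir                                # the built .so lands here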
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
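For the frames that follow: distutils' setup() catches DistutilsError/CCompilerError and re-raises it as SystemExit("error: ..."), and pythran's compile_cxxfile then catches that SystemExit and turns it back into a CompileError, which is why the same gcc command line is printed several times per failing test. A hedged, self-contained sketch of that exception translation, with illustrative function names that are not pythran's API:

# Sketch of the translation chain visible in the traceback below:
# CCompilerError -> SystemExit("error: ...") inside distutils.core.setup,
# then SystemExit -> CompileError in the caller. Names are placeholders.
from distutils.errors import CCompilerError, CompileError

def setup_like(run_commands):
    try:
        run_commands()
    except CCompilerError as msg:              # what distutils.core.setup does
        raise SystemExit("error: " + str(msg))

def compile_cxxfile_like(run_commands):
    try:
        setup_like(run_commands)
    except SystemExit as e:                    # what compile_cxxfile does
        raise CompileError(str(e))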
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpaadz2ihm.cpp -o /tmp/tmplimm81sz/tmp/tmpaadz2ihm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_geometric0a(self): """ Check geometric with 1 argument with mean and variance. 
""" code = """ def numpy_geometric0a(size): from numpy.random import geometric from numpy import var, mean a = [geometric(0.6) for x in range(size)] return (abs(mean(a)- 2) < .05 and abs(var(a) - 3) < 1/8) """ > self.run_test(code, 10 ** 6, numpy_geometric0a=[int]) pythran/tests/test_numpy_random.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_geometric0a', cxxfile = '/tmp/tmpaadz2ihm.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpg9e1nbin', buildtmp = '/tmp/tmplimm81sz' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpaadz2ihm.cpp -o /tmp/tmplimm81sz/tmp/tmpaadz2ihm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" 
failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_geometric0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmplimm81sz/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpaadz2ihm.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpaadz2ihm.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpaadz2ihm.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_geometric0b ____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
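The captured stderr above pinpoints the actual failure: pythonic/include/numpy/conjugate.hpp calls xsimd::conj on an xsimd::batch and cc1plus only finds std::conj, so every test instantiating numpy var/conjugate dies the same way. A small diagnostic sketch, assuming the xsimd headers are installed under /usr/include/xsimd (that path and the text-search approach are my assumptions, not part of this log), to check whether the headers in use declare conj at all:

# Diagnostic sketch only. Assumptions: system xsimd headers live under
# /usr/include/xsimd and a plain text search is a good-enough signal for
# whether any header provides conj.
from pathlib import Path

def xsimd_mentions_conj(include_root="/usr/include/xsimd"):
    root = Path(include_root)
    if not root.is_dir():
        return None                       # headers not installed there
    return any("conj" in header.read_text(errors="ignore")
               for header in root.rglob("*.hpp"))

print(xsimd_mentions_conj())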
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpx2mkg5a5.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpx2mkg5a5.cpp'], output_dir = '/tmp/tmpe7unbtho' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
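The CCompiler_compile body quoted in the next frame bounds parallel compilation with a module-level semaphore sized by get_num_build_jobs(), then maps single_compile over the build items with a ThreadPool. A self-contained sketch of that pattern, with an assumed job count and a no-op worker standing in for the real compiler call:

# Sketch of the semaphore-plus-thread-pool pattern used by
# numpy.distutils' CCompiler_compile below. Assumptions: jobs=4 and the
# worker body are placeholders for get_num_build_jobs() and self._compile().
import threading
import multiprocessing.pool

jobs = 4
_job_semaphore = threading.Semaphore(jobs)

def single_compile(item):
    with _job_semaphore:        # never more than `jobs` concurrent compiles
        return item             # real code invokes the compiler here

pool = multiprocessing.pool.ThreadPool(jobs)
results = pool.map(single_compile, range(16))
pool.close()
pool.join()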
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o', ('/tmp/tmpx2mkg5a5.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
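single_compile, shown just above, also de-duplicates work: the same source file can appear in several extensions, so a global lock plus a _processing_files set ensures only one thread builds a given object file while the others poll. A standalone sketch of that claim/release logic (the claim()/release() wrappers are mine; the real code inlines this around self._compile()):

# Sketch of the _processing_files de-duplication used by single_compile above.
# Assumptions: claim()/release() are illustrative helpers only.
import threading
import time

_global_lock = threading.Lock()
_processing_files = set()

def claim(obj):
    while True:
        with _global_lock:                  # no atomic check-and-add without it
            if obj not in _processing_files:
                _processing_files.add(obj)
                return
        time.sleep(0.1)                     # another thread owns obj; wait

def release(obj):
    with _global_lock:
        _processing_files.remove(obj)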
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o', '/tmp/tmpx2mkg5a5.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o', src = '/tmp/tmpx2mkg5a5.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2mkg5a5.cpp -o /tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_geometric0b', cxxfile = '/tmp/tmpx2mkg5a5.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpj62hs0eg', buildtmp = '/tmp/tmpe7unbtho' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2mkg5a5.cpp -o /tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_geometric0b(self): """ Check geometric with 2 argument with mean and variance. 
""" code = """ def numpy_geometric0b(size): from numpy.random import geometric from numpy import var, mean, sqrt p = 0.25 a = geometric(p, size) return (abs(mean(a)- 4) < 0.05 and abs(sqrt(p) - sqrt(var(a,ddof=1))) < 1/64) """ > self.run_test(code, 10 ** 6, numpy_geometric0b=[int]) pythran/tests/test_numpy_random.py:932: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_geometric0b', cxxfile = '/tmp/tmpx2mkg5a5.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpj62hs0eg', buildtmp = '/tmp/tmpe7unbtho' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx2mkg5a5.cpp -o /tmp/tmpe7unbtho/tmp/tmpx2mkg5a5.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_geometric0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpe7unbtho/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpx2mkg5a5.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpx2mkg5a5.cpp:25: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpx2mkg5a5.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_lognormal0a ____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp4er3oknx.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp4er3oknx.cpp'], output_dir = '/tmp/tmpteyb_mzn' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o', ('/tmp/tmp4er3oknx.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o', '/tmp/tmp4er3oknx.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o', src = '/tmp/tmp4er3oknx.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4er3oknx.cpp -o /tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_lognormal0a', cxxfile = '/tmp/tmp4er3oknx.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1afbp17r', buildtmp = '/tmp/tmpteyb_mzn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4er3oknx.cpp -o /tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_lognormal0a(self): """ Check lognormal with 1 argument with mean and variance. 
""" code = """ def numpy_lognormal0a(size): from numpy.random import lognormal from numpy import var, mean, e m = 0 s = 1/5 a = [lognormal(m) for x in range(size)] rmean = e**(m+(s**2/2)) rvar = (e**(s**2) - 1)*e**(2*m+s**2) return (abs(mean(a)- rmean) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_lognormal0a=[int]) pythran/tests/test_numpy_random.py:860: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_lognormal0a', cxxfile = '/tmp/tmp4er3oknx.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1afbp17r', buildtmp = '/tmp/tmpteyb_mzn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4er3oknx.cpp -o /tmp/tmpteyb_mzn/tmp/tmp4er3oknx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option 
-Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_lognormal0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpteyb_mzn/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp4er3oknx.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp4er3oknx.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp4er3oknx.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_geometric2 _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpa4sshnk4.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpa4sshnk4.cpp'], output_dir = '/tmp/tmpvk8hkeom' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o', ('/tmp/tmpa4sshnk4.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o', '/tmp/tmpa4sshnk4.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o', src = '/tmp/tmpa4sshnk4.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa4sshnk4.cpp -o /tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_geometric2', cxxfile = '/tmp/tmpa4sshnk4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpy_h7nvmw', buildtmp = '/tmp/tmpvk8hkeom' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_geometric2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa4sshnk4.cpp -o /tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_geometric2(self): """Check geometric with shape argument with mean and variance.""" code = """ def numpy_geometric2(size): from numpy.random import geometric from numpy import mean, var p = 0.5 a = geometric(p, size=(size, size)) return (abs(mean(a)-2) < .05 and abs(var(a) - 1) < 1/8) """ > self.run_test(code, 10 ** 3, numpy_geometric2=[int]) pythran/tests/test_numpy_random.py:947: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_geometric2', cxxfile = '/tmp/tmpa4sshnk4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpy_h7nvmw', buildtmp = '/tmp/tmpvk8hkeom' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, 
**kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpa4sshnk4.cpp -o /tmp/tmpvk8hkeom/tmp/tmpa4sshnk4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_geometric2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables 
-fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpvk8hkeom/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpa4sshnk4.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpa4sshnk4.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpa4sshnk4.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________ TestNumpyRandom.test_numpy_standard_exponential0 _______________
[gw0] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential0', ...}
klass = dist = ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
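[Note on the failure above] The captured stderr shows the real error behind every CompileError in this run: the generated C++ includes pythonic/numpy/var.hpp, which pulls in pythonic/include/numpy/conjugate.hpp, and that header's complex-batch wrapper calls xsimd::conj, a name the xsimd headers visible to this build do not define ("'conj' is not a member of 'xsimd'"). gcc therefore rejects the translation unit before any test-specific code matters, and the same diagnostic reappears verbatim in the later failures (test_numpy_standard_exponential0, test_numpy_lognormal0b). The sketch below is a hypothetical way to reproduce this outside the test harness; the module name and exported function are illustrative, not taken from this log, and it only assumes compile_pythrancode from pythran/toolchain.py (the entry point already visible in the tracebacks above) accepts a module name and a Pythran source string.

# Hypothetical reproduction sketch: any Pythran kernel that touches
# numpy.mean / numpy.var should hit the same "'conj' is not a member of
# 'xsimd'" error on this toolchain, because its generated C++ includes
# pythonic/numpy/var.hpp and, through it, pythonic/include/numpy/conjugate.hpp.
from pythran.toolchain import compile_pythrancode

code = """
#pythran export repro(float64 list)
def repro(a):
    from numpy import mean, var
    return mean(a), var(a)
"""

# Expected to raise distutils.errors.CompileError here, just as
# compile_cxxfile() does in the tracebacks above.
compile_pythrancode("repro_xsimd_conj", code)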
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpw7lbbn_q.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpw7lbbn_q.cpp'], output_dir = '/tmp/tmpwvrw7y92' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o', ('/tmp/tmpw7lbbn_q.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o', '/tmp/tmpw7lbbn_q.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o', src = '/tmp/tmpw7lbbn_q.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw7lbbn_q.cpp -o /tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_exponential0' cxxfile = '/tmp/tmpw7lbbn_q.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpyuwvhggb', buildtmp = '/tmp/tmpwvrw7y92' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw7lbbn_q.cpp -o /tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_exponential0(self): """ Check standard_exponential without argument with mean and variance. 
""" code = """ def numpy_standard_exponential0(size): from numpy.random import standard_exponential from numpy import var, mean a = [standard_exponential() for x in range(size)] return (abs(mean(a) - 1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_exponential0=[int]) pythran/tests/test_numpy_random.py:1224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_exponential0' cxxfile = '/tmp/tmpw7lbbn_q.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpyuwvhggb', buildtmp = '/tmp/tmpwvrw7y92' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpw7lbbn_q.cpp -o /tmp/tmpwvrw7y92/tmp/tmpw7lbbn_q.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs 
-Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_exponential0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpwvrw7y92/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpw7lbbn_q.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpw7lbbn_q.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpw7lbbn_q.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
____________________ TestNumpyRandom.test_numpy_lognormal0b ____________________
[gw7] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0b', ...}
klass = dist = ok = True
def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below).
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
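
# The setup() pipeline above -- build a Distribution, parse any config files, parse
# the command line, then run_commands() -- is what pythran's compile_cxxfile drives
# programmatically by handing setup() a fake script_name/script_args pair instead of
# a real "python setup.py build_ext" invocation (see the frames further down). A
# reduced sketch of that pattern; the extension name, source file and build
# directories are placeholders, not taken from this build:
from distutils.core import setup, Extension

ext = Extension("demo_ext", sources=["demo.c"])

setup(
    name="demo_ext",
    ext_modules=[ext],
    # fake CLI call: parsed exactly as if these arguments came from sys.argv
    script_name="setup.py",
    script_args=["--quiet", "build_ext",
                 "--build-lib", "build/lib",
                 "--build-temp", "build/tmp"],
)
# With a real demo.c this leaves demo_ext*.so under build/lib; on a compiler failure
# distutils raises SystemExit, which is what the tracebacks in this log show.
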
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpdsvkcm8g.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpdsvkcm8g.cpp'], output_dir = '/tmp/tmpcdqgzqv6' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
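
# The ordering rule in the comment above is what lets the per-test extra_compile_args
# (-O1, -w, -UNDEBUG) override the distribution-wide -O2 -Wall visible in the captured
# command lines: extra_postargs are appended last when the compiler is spawned, and
# gcc honours the last occurrence of a competing flag. A toy illustration of the final
# argv ordering (all values below are made up):
compiler_so = ["gcc", "-O2", "-Wall", "-fPIC"]          # CFLAGS-derived part
cc_args = ["-DENABLE_PYTHON_MODULE", "-I/usr/include/python3.10", "-c"]
src, obj = "kernel.cpp", "kernel.o"
extra_postargs = ["-std=c++11", "-O1", "-w"]            # extension's extra_compile_args

argv = compiler_so + cc_args + [src, "-o", obj] + extra_postargs
print(" ".join(argv))   # -O2 ... -O1: the trailing -O1/-w win, warnings are silenced
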
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o', ('/tmp/tmpdsvkcm8g.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o', '/tmp/tmpdsvkcm8g.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o', src = '/tmp/tmpdsvkcm8g.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
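
# The CCompiler_compile body above bounds parallel compilation with a semaphore sized
# by get_num_build_jobs() and uses a lock-protected set so two threads never build the
# same object file. The same pattern in isolation, with a stubbed compile step and
# made-up file names:
import threading
import time
from multiprocessing.pool import ThreadPool

jobs = 4                                   # stand-in for get_num_build_jobs()
_job_semaphore = threading.Semaphore(jobs)
_global_lock = threading.Lock()
_processing_files = set()

def compile_one(obj, src):
    print(f"compiling {src} -> {obj}")     # real code would spawn the compiler here
    time.sleep(0.1)

def single_compile(item):
    obj, src = item
    while True:                            # claim the object file, or wait for its owner
        with _global_lock:
            if obj not in _processing_files:
                _processing_files.add(obj)
                break
        time.sleep(0.1)
    try:
        with _job_semaphore:               # at most `jobs` compiles run at once
            compile_one(obj, src)
    finally:
        with _global_lock:
            _processing_files.remove(obj)

build_items = [(f"tmp/mod{i}.o", f"mod{i}.cpp") for i in range(8)]
with ThreadPool(jobs) as pool:
    pool.map(single_compile, build_items)
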
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdsvkcm8g.cpp -o /tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_lognormal0b', cxxfile = '/tmp/tmpdsvkcm8g.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppik6tfuk', buildtmp = '/tmp/tmpcdqgzqv6' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
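
# distutils reports build failures by raising SystemExit (the 'SystemExit: error:
# Command "gcc ..." failed' entries in these tracebacks); pythran's compile_cxxfile
# converts that back into a CompileError so run_test gets a catchable exception (its
# except clause appears further down). The round trip, reduced to a sketch with a
# stand-in build step:
from distutils.errors import CompileError

def build_demo():
    # stand-in for the setup(..., script_args=['build_ext', ...]) call shown above
    raise SystemExit('error: Command "gcc ..." failed with exit status 1')

def compile_demo():
    try:
        build_demo()
    except SystemExit as e:
        raise CompileError(str(e))

try:
    compile_demo()
except CompileError as exc:
    print("compilation failed:", exc)
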
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdsvkcm8g.cpp -o /tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_lognormal0b(self): """ Check lognormal with 2 argument with mean and variance. 
""" code = """ def numpy_lognormal0b(size): from numpy.random import lognormal from numpy import var, mean, e m = 1 s = 1/8 a = lognormal(m, s, size) rmean = e**(m+(s**2/2)) rvar = (e**(s**2) - 1)*e**(2*m+s**2) return (abs(mean(a) - rmean) < 0.1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_lognormal0b=[int]) pythran/tests/test_numpy_random.py:875: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_lognormal0b', cxxfile = '/tmp/tmpdsvkcm8g.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppik6tfuk', buildtmp = '/tmp/tmpcdqgzqv6' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpdsvkcm8g.cpp -o /tmp/tmpcdqgzqv6/tmp/tmpdsvkcm8g.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option 
-Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_lognormal0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpcdqgzqv6/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpdsvkcm8g.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpdsvkcm8g.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
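
# The test body shown above never executes because the extension fails to compile,
# but the reference values it checks are the standard lognormal moments: for
# ln X ~ N(m, s^2), E[X] = e^(m + s^2/2) and Var[X] = (e^(s^2) - 1) * e^(2m + s^2).
# The same check in plain NumPy, independent of pythran:
import numpy as np

def numpy_lognormal0b(size):
    m, s = 1, 1 / 8
    a = np.random.lognormal(m, s, size)
    rmean = np.e ** (m + s ** 2 / 2)                           # E[X]
    rvar = (np.e ** (s ** 2) - 1) * np.e ** (2 * m + s ** 2)   # Var[X]
    return abs(np.mean(a) - rmean) < 0.1 and abs(np.var(a) - rvar) < 0.1

print(numpy_lognormal0b(10 ** 6))   # expected: True (statistically, almost always)
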
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpdsvkcm8g.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________ TestNumpyRandom.test_numpy_standard_exponential1 _______________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpyn86xtcq.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpyn86xtcq.cpp'], output_dir = '/tmp/tmpjuesam9j' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
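
# Underneath all of these frames, build_ext.run() (shown above) only configures a
# distutils CCompiler -- new_compiler(), customize_compiler(), include dirs, macros,
# libraries -- and build_extension() then hands the sources to its compile() method.
# The same API can be driven directly; the source file and output directory below are
# placeholders:
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler
from distutils.errors import CompileError

cc = new_compiler(verbose=1)
customize_compiler(cc)                        # pull CC/CFLAGS etc. from sysconfig
cc.set_include_dirs(["/usr/include/python3.10"])
cc.define_macro("ENABLE_PYTHON_MODULE")       # mirrors the -D options in the log

try:
    objects = cc.compile(["demo.c"],
                         output_dir="build/tmp",
                         extra_postargs=["-O1", "-w"])
    print("built:", objects)
except CompileError as exc:
    print("compilation failed:", exc)
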
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o', ('/tmp/tmpyn86xtcq.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o', '/tmp/tmpyn86xtcq.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o', src = '/tmp/tmpyn86xtcq.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
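
# The UnixCCompiler__compile frame (continued just below, and shown once already
# above) appends -MMD -MF obj + '.d' when automatic dependencies are enabled, so each
# object file gets a makefile fragment naming every header it included -- a handy way
# to confirm which conjugate.hpp a translation unit actually pulled in. A stand-alone
# sketch of those flags; file names are placeholders:
import subprocess

src, obj = "kernel.cpp", "kernel.o"
cmd = ["gcc", "-c", src, "-o", obj,
       "-MMD", "-MF", obj + ".d"]            # write header dependencies to kernel.o.d
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
    print(result.stderr)                     # the same diagnostics pytest captured above
else:
    print(open(obj + ".d").read())           # "kernel.o: kernel.cpp some_header.hpp ..."
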
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyn86xtcq.cpp -o /tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_exponential1' cxxfile = '/tmp/tmpyn86xtcq.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu3m23jun', buildtmp = '/tmp/tmpjuesam9j' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyn86xtcq.cpp -o /tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_exponential1(self): """ Check standard_exponential with size argument with mean and variance.""" code = """ def numpy_standard_exponential1(size): from numpy.random import standard_exponential from numpy import var, mean a = standard_exponential(size) return (abs(mean(a) - 1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_exponential1=[int]) pythran/tests/test_numpy_random.py:1235: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_exponential1' cxxfile = '/tmp/tmpyn86xtcq.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu3m23jun', buildtmp = '/tmp/tmpjuesam9j' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpyn86xtcq.cpp -o /tmp/tmpjuesam9j/tmp/tmpyn86xtcq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_exponential1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 
-mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpjuesam9j/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpyn86xtcq.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpyn86xtcq.cpp:21:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpyn86xtcq.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
______________________ TestNumpyRandom.test_numpy_gumbel0 ______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0', ...}
klass =
dist =
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script
        needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmprzui94or.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmprzui94or.cpp'], output_dir = '/tmp/tmpt1hvdbwr' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpt1hvdbwr/tmp/tmprzui94or.o', ('/tmp/tmprzui94or.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpt1hvdbwr/tmp/tmprzui94or.o', '/tmp/tmprzui94or.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpt1hvdbwr/tmp/tmprzui94or.o', src = '/tmp/tmprzui94or.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprzui94or.cpp -o /tmp/tmpt1hvdbwr/tmp/tmprzui94or.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gumbel0', cxxfile = '/tmp/tmprzui94or.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptgcsmmml', buildtmp = '/tmp/tmpt1hvdbwr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprzui94or.cpp -o /tmp/tmpt1hvdbwr/tmp/tmprzui94or.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gumbel0(self): """ Check gumbel without argument with mean and variance. """ code = """ def numpy_gumbel0(size): from numpy.random import gumbel from numpy import var, mean, pi u = 0. 
rmean = u + 0.57721 rvar = (pi**2/6) a = [gumbel() for x in range(size)] return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_gumbel0=[int]) pythran/tests/test_numpy_random.py:1301: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gumbel0', cxxfile = '/tmp/tmprzui94or.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptgcsmmml', buildtmp = '/tmp/tmpt1hvdbwr' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmprzui94or.cpp -o /tmp/tmpt1hvdbwr/tmp/tmprzui94or.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
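The failures captured in this section all reduce to the same C++ error: pythonic/include/numpy/conjugate.hpp line 25 calls xsimd::conj(v), and the xsimd headers visible on this include path do not declare conj, so any test whose generated module instantiates numpy.var (which includes conjugate.hpp, per the include chains in the captured stderr) fails to compile. The following sketch is a hedged, standalone reproduction path, not something the test suite itself runs: it assumes the same environment as this build (pythran 0.11.0 importable, the same xsimd headers), and it goes through pythran.toolchain.compile_pythrancode, the entry point visible in the tracebacks above; the module name and kernel are illustrative only.

# Standalone reproduction sketch (assumptions: same buildroot as this log,
# pythran importable, same xsimd headers on the include path).
from pythran.toolchain import compile_pythrancode
from distutils.errors import CompileError  # what compile_cxxfile re-raises above

# Any exported kernel that calls numpy.var pulls in pythonic/numpy/var.hpp,
# which includes pythonic/include/numpy/conjugate.hpp -- the file where
# xsimd::conj(v) fails to resolve in the captured stderr sections.
code = '''
#pythran export repro_var(float64[])
def repro_var(a):
    from numpy import var
    return var(a)
'''

try:
    compile_pythrancode("repro_xsimd_conj", code)
    print("compiled cleanly: this xsimd provides conj()")
except CompileError as exc:
    # Expected on this buildroot: the same "'conj' is not a member of 'xsimd'"
    # diagnostic seen above.
    print("reproduced:", exc)

If the sketch reproduces the error outside the test harness, it narrows the problem to the pythran/xsimd header combination rather than to anything specific to run_test or the numpy.random tests.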
----------------------------- Captured stdout call -----------------------------
running build_ext
new_compiler returns
building 'test_numpy_gumbel0' extension
C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpt1hvdbwr/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmprzui94or.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmprzui94or.cpp:27:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmprzui94or.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_______________ TestNumpyRandom.test_numpy_standard_exponential2 _______________
[gw0] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential2', ...}
klass =
dist =
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script
        needs to do, in a highly flexible and user-driven way.  Briefly:
        create a Distribution instance; find and parse config files; parse
        the command line; run each Distutils command found there, customized
        by the options supplied to 'setup()' (as keyword arguments), in
        config files, and on the command line.

        The Distribution instance might be an instance of a class supplied
        via the 'distclass' keyword argument to 'setup'; if no such class is
        supplied, then the Distribution class (in dist.py) is instantiated.
        All other arguments to 'setup' (except for 'cmdclass') are used to
        set attributes of the Distribution instance.

        The 'cmdclass' argument, if supplied, is a dictionary mapping command
        names to command classes.  Each command encountered on the command
        line will be turned into a command class, which is in turn
        instantiated; any class found in 'cmdclass' is used in place of the
        default, which is (for command 'foo_bar') class 'foo_bar' in module
        'distutils.command.foo_bar'.  The command class must provide a
        'user_options' attribute which is a list of option specifiers for
        'distutils.fancy_getopt'.  Any command-line options between the
        current and the next command are used to set attributes of the
        current command object.

        When the entire command-line has been successfully parsed, calls the
        'run()' method on each command object in turn.  This method will be
        driven entirely by the Distribution object (which each command object
        has a reference to, thanks to its constructor), and the
        command-specific options that became attributes of each command
        object.
        """

        global _setup_stop_after, _setup_distribution

        # Determine the distribution class -- either caller-supplied or
        # our Distribution (see below).
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp8mpmn803.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp8mpmn803.cpp'], output_dir = '/tmp/tmp_j__awsx' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp_j__awsx/tmp/tmp8mpmn803.o', ('/tmp/tmp8mpmn803.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
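Aside: the CCompiler_compile listing above shows the pattern numpy.distutils uses to bound parallel compilation — a ThreadPool drives one job per source file while a global semaphore caps the number of concurrent compiler invocations (the semaphore matters because, per the comments in that listing, compilation can also be parallelized at the extension level). The following is a minimal, self-contained sketch of that pattern only; the file names and the run_compiler helper are hypothetical, and this is not the numpy.distutils implementation.

# Minimal sketch of the bounded-parallel compile pattern (hypothetical example).
import subprocess
import threading
from multiprocessing.pool import ThreadPool

JOBS = 4
# Global cap on concurrent compiler processes, shared by all pools/threads.
_job_semaphore = threading.Semaphore(JOBS)

def run_compiler(src, obj):
    # Hold one semaphore slot for the duration of a single compiler invocation.
    with _job_semaphore:
        subprocess.check_call(["gcc", "-c", src, "-o", obj])

def compile_all(pairs):
    # pairs: [(source, object), ...]; the thread pool fans the jobs out.
    with ThreadPool(JOBS) as pool:
        pool.starmap(run_compiler, pairs)

if __name__ == "__main__":
    compile_all([("a.c", "a.o"), ("b.c", "b.o")])  # hypothetical sources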
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
args = ('/tmp/tmp_j__awsx/tmp/tmp8mpmn803.o', '/tmp/tmp8mpmn803.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...])
kw = {}

>   m = lambda self, *args, **kw: func(self, *args, **kw)

/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
obj = '/tmp/tmp_j__awsx/tmp/tmp8mpmn803.o', src = '/tmp/tmp8mpmn803.cpp'
ext = '.cpp'
cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]
extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]
pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]

    def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
        """Compile a single source files with a Unix-style compiler."""
        # HP ad-hoc fix, see ticket 1383
        ccomp = self.compiler_so
        if ccomp[0] == 'aCC':
            # remove flags that will trigger ANSI-C mode for aCC
            if '-Ae' in ccomp:
                ccomp.remove('-Ae')
            if '-Aa' in ccomp:
                ccomp.remove('-Aa')
            # add flags for (almost) sane C++ handling
            ccomp += ['-AA']
            self.compiler_so = ccomp
        # ensure OPT environment variable is read
        if 'OPT' in os.environ:
            # XXX who uses this?
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8mpmn803.cpp -o /tmp/tmp_j__awsx/tmp/tmp8mpmn803.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_exponential2' cxxfile = '/tmp/tmp8mpmn803.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8y8ct_rf', buildtmp = '/tmp/tmp_j__awsx' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_exponential2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8mpmn803.cpp -o /tmp/tmp_j__awsx/tmp/tmp8mpmn803.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_exponential2(self): """Check standard_exponential with shape argument with mean and variance.""" code = """ def numpy_standard_exponential2(size): from numpy.random import standard_exponential from numpy import mean, var a = standard_exponential((size, size)) return (abs(mean(a) - 1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 3, numpy_standard_exponential2=[int]) pythran/tests/test_numpy_random.py:1246: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_exponential2' cxxfile = '/tmp/tmp8mpmn803.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8y8ct_rf', buildtmp = '/tmp/tmp_j__awsx' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8mpmn803.cpp -o /tmp/tmp_j__awsx/tmp/tmp8mpmn803.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_exponential2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 
-mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp_j__awsx/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp8mpmn803.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp8mpmn803.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp8mpmn803.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_lognormal1 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9yxyfcpq.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9yxyfcpq.cpp'], output_dir = '/tmp/tmpy698rri_' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o', ('/tmp/tmp9yxyfcpq.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
args = ('/tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o', '/tmp/tmp9yxyfcpq.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...])
kw = {}

>   m = lambda self, *args, **kw: func(self, *args, **kw)

/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
obj = '/tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o', src = '/tmp/tmp9yxyfcpq.cpp'
ext = '.cpp'
cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]
extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]
pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]

    def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
        """Compile a single source files with a Unix-style compiler."""
        # HP ad-hoc fix, see ticket 1383
        ccomp = self.compiler_so
        if ccomp[0] == 'aCC':
            # remove flags that will trigger ANSI-C mode for aCC
            if '-Ae' in ccomp:
                ccomp.remove('-Ae')
            if '-Aa' in ccomp:
                ccomp.remove('-Aa')
            # add flags for (almost) sane C++ handling
            ccomp += ['-AA']
            self.compiler_so = ccomp
        # ensure OPT environment variable is read
        if 'OPT' in os.environ:
            # XXX who uses this?
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9yxyfcpq.cpp -o /tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_lognormal1', cxxfile = '/tmp/tmp9yxyfcpq.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpljsa46t1', buildtmp = '/tmp/tmpy698rri_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9yxyfcpq.cpp -o /tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_lognormal1(self): """ Check lognormal with size argument with mean and variance.""" code = """ def numpy_lognormal1(size): from numpy.random import lognormal from numpy import var, mean, e m = 0 s = 1/4 rmean = e**(m+(s**2/2)) rvar = (e**(s**2) - 1)*e**(2*m+s**2) a = lognormal(size=size) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 6, numpy_lognormal1=[int]) pythran/tests/test_numpy_random.py:890: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_lognormal1', cxxfile = '/tmp/tmp9yxyfcpq.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpljsa46t1', buildtmp = '/tmp/tmpy698rri_' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9yxyfcpq.cpp -o /tmp/tmpy698rri_/tmp/tmp9yxyfcpq.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_lognormal1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpy698rri_/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp9yxyfcpq.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmp9yxyfcpq.cpp:21:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp9yxyfcpq.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
__________________ TestNumpyRandom.test_numpy_standard_gamma0 __________________
[gw0] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma0', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script
        needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp00aukt23.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp00aukt23.cpp'], output_dir = '/tmp/tmpubvelxyg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpubvelxyg/tmp/tmp00aukt23.o', ('/tmp/tmp00aukt23.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
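[Annotation: the frames above trace the path a failing test takes: run_test generates C++ from the Python kernel, then compile_pythrancode / compile_cxxcode / compile_cxxfile hand the generated file to numpy.distutils' build_ext, which spawns the gcc command recorded in the log. The sketch below is a minimal standalone reproduction of one such compile outside the pytest harness; the direct compile_pythrancode call and the explicit "#pythran export" line are assumptions about the toolchain API rather than the harness's exact invocation, and the kernel is the numpy_lognormal1 test shown earlier.]

# Standalone reproduction sketch.  It assumes pythran.toolchain.compile_pythrancode
# extracts the export signature from the "#pythran export" comment when no
# explicit specs are passed.
from pythran.toolchain import compile_pythrancode

KERNEL = """
#pythran export numpy_lognormal1(int)
def numpy_lognormal1(size):
    from numpy.random import lognormal
    from numpy import var, mean, e
    m = 0
    s = 1 / 4
    rmean = e ** (m + (s ** 2 / 2))
    rvar = (e ** (s ** 2) - 1) * e ** (2 * m + s ** 2)
    a = lognormal(size=size)
    return abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1
"""

if __name__ == "__main__":
    # The test harness additionally forwards extra_compile_args such as
    # '-O1 -Wall -w -UNDEBUG' (visible in the kwargs above); they are optional
    # here.  When the xsimd headers found on the include path do not provide
    # xsimd::conj, this call fails with the same distutils.errors.CompileError
    # ("error: 'conj' is not a member of 'xsimd'") captured in this log.
    print(compile_pythrancode("test_numpy_lognormal1", KERNEL))
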
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpubvelxyg/tmp/tmp00aukt23.o', '/tmp/tmp00aukt23.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpubvelxyg/tmp/tmp00aukt23.o', src = '/tmp/tmp00aukt23.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp00aukt23.cpp -o /tmp/tmpubvelxyg/tmp/tmp00aukt23.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_gamma0', cxxfile = '/tmp/tmp00aukt23.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpapi0ypk8', buildtmp = '/tmp/tmpubvelxyg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp00aukt23.cpp -o /tmp/tmpubvelxyg/tmp/tmp00aukt23.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_gamma0(self): """ Check standard_gamma without argument with mean and variance. 
""" code = """ def numpy_standard_gamma0(size): from numpy.random import standard_gamma from numpy import var, mean a = [standard_gamma(1) for x in range(size)] return (abs(mean(a) - 1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_gamma0=[int]) pythran/tests/test_numpy_random.py:1261: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_gamma0', cxxfile = '/tmp/tmp00aukt23.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpapi0ypk8', buildtmp = '/tmp/tmpubvelxyg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp00aukt23.cpp -o /tmp/tmpubvelxyg/tmp/tmp00aukt23.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_gamma0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpubvelxyg/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp00aukt23.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp00aukt23.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp00aukt23.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_gumbel0a ______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp104qw9xv.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp104qw9xv.cpp'], output_dir = '/tmp/tmp1uy1vb71' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o', ('/tmp/tmp104qw9xv.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o', '/tmp/tmp104qw9xv.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o', src = '/tmp/tmp104qw9xv.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp104qw9xv.cpp -o /tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gumbel0a', cxxfile = '/tmp/tmp104qw9xv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3mgepnit', buildtmp = '/tmp/tmp1uy1vb71' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp104qw9xv.cpp -o /tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gumbel0a(self): """ Check gumbel with 1 argument with mean and variance. 
""" code = """ def numpy_gumbel0a(size): from numpy.random import gumbel from numpy import var, mean, pi u = 1 rmean = u + 0.57721 rvar = (pi**2/6) a = [gumbel(u) for x in range(size)] return (abs(mean(a) - rmean ) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_gumbel0a=[int]) pythran/tests/test_numpy_random.py:1315: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gumbel0a', cxxfile = '/tmp/tmp104qw9xv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3mgepnit', buildtmp = '/tmp/tmp1uy1vb71' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp104qw9xv.cpp -o /tmp/tmp1uy1vb71/tmp/tmp104qw9xv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gumbel0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp1uy1vb71/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp104qw9xv.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp104qw9xv.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
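For reference, the numpy_gumbel0a kernel quoted earlier in this traceback performs an ordinary statistical check: samples from gumbel(u) with u = 1 should have mean close to u + 0.57721 (the Euler-Mascheroni constant) and variance close to pi**2/6. A plain-numpy restatement, using the same tolerances as the test, shows what would be verified if the extension compiled:

    # Plain-numpy version of the numpy_gumbel0a check; the constants and the
    # 0.05 tolerances are taken from the test source quoted above.
    import numpy as np

    size = 10 ** 6
    u = 1
    rmean = u + 0.57721        # loc + Euler-Mascheroni constant
    rvar = np.pi ** 2 / 6      # variance of a Gumbel with scale 1
    a = np.random.gumbel(u, size=size)
    print(abs(a.mean() - rmean) < 0.05 and abs(a.var() - rvar) < 0.05)
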
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp104qw9xv.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_lognormal2 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpzf1pgqnv.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpzf1pgqnv.cpp'], output_dir = '/tmp/tmp7ul8kudp' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o', ('/tmp/tmpzf1pgqnv.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o', '/tmp/tmpzf1pgqnv.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o', src = '/tmp/tmpzf1pgqnv.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzf1pgqnv.cpp -o /tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_lognormal2', cxxfile = '/tmp/tmpzf1pgqnv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5eiaddz9', buildtmp = '/tmp/tmp7ul8kudp' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_lognormal2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzf1pgqnv.cpp -o /tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_lognormal2(self): """Check lognormal with shape argument with mean and variance.""" code = """ def numpy_lognormal2(size): from numpy.random import lognormal from numpy import mean, var, e m = 2 s = 1/2 rmean = e**(m+(s**2/2)) rvar = (e**(s**2) - 1)*e**(2*m+s**2) a = lognormal(size=(size, size)) return (abs(mean(a) - rmean) < .1 and abs(var(a) - rvar) < .1) """ > self.run_test(code, 10 ** 3, numpy_lognormal2=[int]) pythran/tests/test_numpy_random.py:905: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_lognormal2', cxxfile = '/tmp/tmpzf1pgqnv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5eiaddz9', buildtmp = '/tmp/tmp7ul8kudp' 
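
The failing kernel above (test_numpy_lognormal2) hard-codes the closed-form lognormal moments rmean = e**(m + s**2/2) and rvar = (e**(s**2) - 1) * e**(2*m + s**2). A NumPy-only check of those formulas, independent of any Pythran compilation, is sketched below; unlike the kernel above, which draws with NumPy's default parameters, the sketch passes mean=m and sigma=s explicitly so the draws match the formulas.

    import numpy as np

    m, s = 2, 0.5
    rmean = np.e ** (m + s ** 2 / 2)                          # E[X], X ~ LogNormal(m, s)
    rvar = (np.e ** (s ** 2) - 1) * np.e ** (2 * m + s ** 2)  # Var[X]

    a = np.random.lognormal(mean=m, sigma=s, size=(1000, 1000))
    print(a.mean(), rmean)   # sample mean vs closed form
    print(a.var(), rvar)     # sample variance vs closed form
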
extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzf1pgqnv.cpp -o /tmp/tmp7ul8kudp/tmp/tmpzf1pgqnv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_lognormal2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 
-mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp7ul8kudp/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpzf1pgqnv.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpzf1pgqnv.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpzf1pgqnv.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
__________________ TestNumpyRandom.test_numpy_standard_gamma1 __________________
[gw0] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma1', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs
        to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
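
The loop at the top of this frame is Pythran's PythranBuildExt hook stripping every flag listed under the 'ignoreflags' option of the [compiler] section of its configuration (the comment above names distutils' C-only -Wstrict-prototypes as the typical offender) from compiler_so and linker_so. The idiom reduces to removing every occurrence of a flag from a command list; the command below is illustrative.

    # Strip every occurrence of a C-only flag from a compiler command list,
    # same shape as the PythranBuildExt loop above.
    cmd = ["gcc", "-Wstrict-prototypes", "-O2", "-Wstrict-prototypes", "-fPIC"]
    flag = "-Wstrict-prototypes"
    try:
        while True:
            cmd.remove(flag)          # ValueError ends the loop when none are left
    except ValueError:
        pass
    print(cmd)                        # ['gcc', '-O2', '-fPIC']
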
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpq6g5wn2u.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpq6g5wn2u.cpp'], output_dir = '/tmp/tmp36dh2an0' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o', ('/tmp/tmpq6g5wn2u.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
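
Each single_compile call above ultimately hands one source/object pair to self._compile, which spawns the full gcc command quoted in the CompileError messages. For triage, that invocation can be re-run by hand outside distutils; the sketch below heavily abridges the flag list (copy the complete set from the CompileError text for a faithful reproduction) and reuses the temporary paths from this frame's locals.

    import subprocess

    cmd = [
        "gcc", "-c", "/tmp/tmpq6g5wn2u.cpp",
        "-o", "/tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o",
        "-std=c++11",
        "-DENABLE_PYTHON_MODULE", "-D__PYTHRAN__=3", "-DPYTHRAN_BLAS_OPENBLAS",
        "-I/usr/include/flexiblas",
        "-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran",
        "-I/usr/lib64/python3.10/site-packages/numpy/core/include",
        "-I/usr/include/python3.10",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.returncode)   # 1 in this log: 'conj' is not a member of 'xsimd'
    print(result.stderr)
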
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o', '/tmp/tmpq6g5wn2u.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o', src = '/tmp/tmpq6g5wn2u.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpq6g5wn2u.cpp -o /tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_gamma1', cxxfile = '/tmp/tmpq6g5wn2u.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbvxjoemk', buildtmp = '/tmp/tmp36dh2an0' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpq6g5wn2u.cpp -o /tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_gamma1(self): """ Check standard_gamma with size argument with mean and variance.""" code = """ def numpy_standard_gamma1(size): from numpy.random import standard_gamma from numpy import var, mean a = standard_gamma(2, size) return (abs(mean(a) - 2) < .05 and abs(var(a) - 2) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_gamma1=[int]) pythran/tests/test_numpy_random.py:1272: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_gamma1', cxxfile = '/tmp/tmpq6g5wn2u.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbvxjoemk', buildtmp = '/tmp/tmp36dh2an0' extension = def compile_cxxfile(module_name, cxxfile, 
output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpq6g5wn2u.cpp -o /tmp/tmp36dh2an0/tmp/tmpq6g5wn2u.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_gamma1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 
-fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp36dh2an0/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpq6g5wn2u.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpq6g5wn2u.cpp:21:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpq6g5wn2u.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_gumbel0b ______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0b', ...}
klass = 
dist = 
ok = True

    def setup (**attrs):
        """The gateway to the Distutils: do everything your setup script needs
        to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
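For reference, the pattern compile_cxxfile uses above -- faking a command line for distutils via script_name/script_args instead of running a real setup.py -- looks roughly like this in isolation. This is a sketch only; "example_module" and example.cpp are placeholders, not files from this build:

from tempfile import mkdtemp
from distutils.core import setup, Extension

builddir, buildtmp = mkdtemp(), mkdtemp()
setup(name="example_module",
      ext_modules=[Extension("example_module", ["example.cpp"])],
      # fake CLI call, same idea as compile_cxxfile above
      script_name="setup.py",
      script_args=["--quiet", "build_ext",
                   "--build-lib", builddir,
                   "--build-temp", buildtmp])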
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
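The cfg module referenced in the next frame is pythran's configuration layer; it is what turns the pythran-linux.cfg settings into the -DPYTHRAN_BLAS_OPENBLAS and -I/usr/include/flexiblas options visible in the compile lines above. A hedged sketch of inspecting it directly -- make_extension and the keys it returns are assumptions about the pythran version in this build root, and whether PythranExtension uses exactly this call is likewise an assumption:

from pythran import config as cfg

ext = cfg.make_extension(python=True)   # assumed entry point of the config layer
print(sorted(ext))                      # e.g. define_macros, include_dirs, ...
print(ext.get("define_macros"))         # expected to include PYTHRAN_BLAS_OPENBLAS here
print(ext.get("include_dirs"))          # expected to include /usr/include/flexiblas here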
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpt8hxgntj.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpt8hxgntj.cpp'], output_dir = '/tmp/tmpbscknop5' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpbscknop5/tmp/tmpt8hxgntj.o', ('/tmp/tmpt8hxgntj.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
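CCompiler_compile above parallelizes per-source compilation with a ThreadPool while a module-level semaphore caps the number of concurrent compiler processes. Stripped of the distutils specifics, the pattern is just the following (a self-contained sketch, not numpy's code; the job count and file names are placeholders):

import threading
import multiprocessing.pool

jobs = 4                                 # numpy derives this via get_num_build_jobs()
job_semaphore = threading.Semaphore(jobs)

def single_compile(src):
    with job_semaphore:                  # at most `jobs` compiles run at once
        print("compiling", src)          # real code spawns the compiler here

pool = multiprocessing.pool.ThreadPool(jobs)
pool.map(single_compile, ["a.cpp", "b.cpp", "c.cpp"])
pool.close()
pool.join()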
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpbscknop5/tmp/tmpt8hxgntj.o', '/tmp/tmpt8hxgntj.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpbscknop5/tmp/tmpt8hxgntj.o', src = '/tmp/tmpt8hxgntj.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt8hxgntj.cpp -o /tmp/tmpbscknop5/tmp/tmpt8hxgntj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gumbel0b', cxxfile = '/tmp/tmpt8hxgntj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmps4wj8p9t', buildtmp = '/tmp/tmpbscknop5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
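The spawn() call in UnixCCompiler__compile above adds '-MMD -MF <obj>.d' when _auto_depends is set, so gcc emits a make-style fragment naming every header the translation unit pulled in. In isolation that looks like this (self-contained sketch; file names are placeholders and gcc is assumed to be on PATH):

import pathlib
import subprocess

pathlib.Path("example.c").write_text('#include <stdio.h>\nint main(void) { return 0; }\n')
subprocess.check_call(["gcc", "-c", "example.c", "-o", "example.o",
                       "-MMD", "-MF", "example.o.d"])   # the flags _auto_depends adds above
print(pathlib.Path("example.o.d").read_text())          # make rule listing included headers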
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
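The cmdclass mechanism described in the docstring above is how PythranBuildExt gets wired in: a class supplied under a command name replaces the stock distutils command of that name. A minimal illustration -- LoggingBuildExt is a made-up name for this sketch:

from distutils.command.build_ext import build_ext

class LoggingBuildExt(build_ext):
    def run(self):
        print("building extensions:", [e.name for e in self.extensions])
        super().run()

# passed to setup() the same way compile_cxxfile passes PythranBuildExt above:
#   setup(..., cmdclass={"build_ext": LoggingBuildExt}, ...)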
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt8hxgntj.cpp -o /tmp/tmpbscknop5/tmp/tmpt8hxgntj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gumbel0b(self): """ Check gumbel with 2 argument with mean and variance. 
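The kernel quoted just below compares the sample mean and variance of numpy.random.gumbel against the analytic Gumbel moments, mean = u + 0.57721*s and var = (pi**2/6)*s**2. The same check in plain NumPy, outside pythran (a sketch only, using the test's own tolerances):

import numpy as np

u, s = 1.5, 2.0
a = np.random.gumbel(u, s, 10 ** 6)
print(abs(a.mean() - (u + 0.57721 * s)) < 0.05)         # Gumbel mean: u + Euler-Mascheroni * s
print(abs(a.var() - (np.pi ** 2 / 6) * s ** 2) < 0.05)  # Gumbel variance: pi^2/6 * s^2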
""" code = """ def numpy_gumbel0b(size): from numpy.random import gumbel from numpy import var, mean, pi u = 1.5 s = 2 rmean = u + 0.57721*s rvar = (pi**2/6)*s**2 a = gumbel(u, s, size) return (abs(mean(a) - rmean) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_gumbel0b=[int]) pythran/tests/test_numpy_random.py:1330: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gumbel0b', cxxfile = '/tmp/tmpt8hxgntj.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmps4wj8p9t', buildtmp = '/tmp/tmpbscknop5' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpt8hxgntj.cpp -o /tmp/tmpbscknop5/tmp/tmpt8hxgntj.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gumbel0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpbscknop5/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpt8hxgntj.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpt8hxgntj.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpt8hxgntj.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________ TestNumpyRandom.test_numpy_standard_gamma2 __________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpe1p_tjn6.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpe1p_tjn6.cpp'], output_dir = '/tmp/tmp6e7_m3_f' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o', ('/tmp/tmpe1p_tjn6.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o', '/tmp/tmpe1p_tjn6.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o', src = '/tmp/tmpe1p_tjn6.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe1p_tjn6.cpp -o /tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_gamma2', cxxfile = '/tmp/tmpe1p_tjn6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpnxxafhp0', buildtmp = '/tmp/tmp6e7_m3_f' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_gamma2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe1p_tjn6.cpp -o /tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_gamma2(self): """Check standard_gamma with shape argument with mean and variance.""" code = """ def numpy_standard_gamma2(size): from numpy.random import standard_gamma from numpy import mean, var a = standard_gamma(3, (size, size)) return (abs(mean(a) - 3) < .05 and abs(var(a) - 3) < .05) """ > self.run_test(code, 10 ** 3, numpy_standard_gamma2=[int]) pythran/tests/test_numpy_random.py:1283: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_gamma2', cxxfile = '/tmp/tmpe1p_tjn6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpnxxafhp0', buildtmp = '/tmp/tmp6e7_m3_f' extension = def compile_cxxfile(module_name, cxxfile, 
output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpe1p_tjn6.cpp -o /tmp/tmp6e7_m3_f/tmp/tmpe1p_tjn6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_gamma2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 
-fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp6e7_m3_f/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpe1p_tjn6.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpe1p_tjn6.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpe1p_tjn6.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_logseries0 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
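The loop quoted just below strips every flag listed under pythran's "ignoreflags" compiler setting from compiler_so and linker_so before the extension is built. The same idea in isolation; the command line and ignore list here are made-up examples, not pythran's actual configuration:

# made-up example command line and ignore list
compiler_so = ["gcc", "-O2", "-Wstrict-prototypes", "-fPIC", "-Wstrict-prototypes"]
ignoreflags = ["-Wstrict-prototypes"]

for flag in ignoreflags:
    # remove every occurrence; list.remove() raises ValueError once none are left
    try:
        while True:
            compiler_so.remove(flag)
    except ValueError:
        pass

print(compiler_so)   # ['gcc', '-O2', '-fPIC']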
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpm5xb0lew.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpm5xb0lew.cpp'], output_dir = '/tmp/tmpwju4x2qo' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
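The compile loop quoted below caps concurrent compiler invocations with a global semaphore, tracks in-flight object files in a set guarded by a lock, and fans the work out over a ThreadPool. A minimal standalone sketch of that same pattern, with illustrative names (fake_compile, NUM_JOBS) that are not part of numpy.distutils:

import threading
import time
from multiprocessing.pool import ThreadPool

NUM_JOBS = 4                       # illustrative stand-in for get_num_build_jobs()
_global_lock = threading.Lock()
_job_semaphore = threading.Semaphore(NUM_JOBS)
_processing_files = set()

def fake_compile(obj, src):
    """Illustrative stand-in for self._compile(); just sleeps."""
    time.sleep(0.01)

def single_compile(item):
    obj, src = item
    # avoid two threads building the same object file at once
    while True:
        with _global_lock:
            if obj not in _processing_files:
                _processing_files.add(obj)
                break
        time.sleep(0.1)            # another thread is building it; wait
    try:
        with _job_semaphore:       # cap concurrent compiler invocations
            fake_compile(obj, src)
    finally:
        with _global_lock:
            _processing_files.remove(obj)

build_items = [("mod%d.o" % i, "mod%d.cpp" % i) for i in range(8)]
with ThreadPool(NUM_JOBS) as pool:
    pool.map(single_compile, build_items)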
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o', ('/tmp/tmpm5xb0lew.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o', '/tmp/tmpm5xb0lew.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o', src = '/tmp/tmpm5xb0lew.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpm5xb0lew.cpp -o /tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logseries0', cxxfile = '/tmp/tmpm5xb0lew.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpznrcljs5', buildtmp = '/tmp/tmpwju4x2qo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
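The `if ok:` block below is where distutils converts the CompileError raised deeper in the build into SystemExit("error: ..."), and pythran's compile_cxxfile (quoted earlier in the traceback) turns that SystemExit back into a CompileError; this is why every failure above shows the chained "During handling of the above exception, another exception occurred" messages. Reduced to a sketch, with dummy_setup standing in for the real distutils setup():

from distutils.errors import CompileError

def dummy_setup():
    """Illustrative stand-in for distutils.core.setup() running build_ext."""
    try:
        raise CompileError('Command "gcc ..." failed with exit status 1')
    except CompileError as msg:
        # like distutils.core.setup(): swallow the build error, exit the fake CLI
        raise SystemExit("error: " + str(msg))

def compile_cxxfile_like():
    # like pythran.toolchain.compile_cxxfile: surface a CompileError to callers
    try:
        dummy_setup()
    except SystemExit as e:
        raise CompileError(str(e))

try:
    compile_cxxfile_like()
except CompileError as e:
    print("caught:", e)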
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpm5xb0lew.cpp -o /tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logseries0(self): """ Check logseries without argument with mean and variance. 
""" code = """ def numpy_logseries0(size): from numpy.random import logseries from numpy import var, mean, log s = 0.5 rmean = s / (log(1 - s)*(s - 1)) rvar = -(s*(s+log(1-s)))/((s - 1)**2*(log(1-s))**2) a = [logseries(s) for x in range(size)] return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_logseries0=[int]) pythran/tests/test_numpy_random.py:1540: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logseries0', cxxfile = '/tmp/tmpm5xb0lew.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpznrcljs5', buildtmp = '/tmp/tmpwju4x2qo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpm5xb0lew.cpp -o /tmp/tmpwju4x2qo/tmp/tmpm5xb0lew.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context 
-Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logseries0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpwju4x2qo/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpm5xb0lew.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpm5xb0lew.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpm5xb0lew.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_gumbel1 ______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
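# Note: the frames from here on are stock distutils machinery. The pythran test
# harness reaches them through compile_cxxfile() (pythran/toolchain.py), which
# fakes a command line -- script_name='setup.py', script_args=['--quiet' (or
# '--verbose'), 'build_ext', '--build-lib', <builddir>, '--build-temp',
# <buildtmp>] -- so each generated C++ file is built as a throw-away extension
# module in temporary directories.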
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpz8k_fc4m.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpz8k_fc4m.cpp'], output_dir = '/tmp/tmpd3ne7yn2' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
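# Note: CCompiler_compile below is numpy.distutils' replacement for the stock
# distutils CCompiler.compile (installed through the m = lambda ... wrapper at
# ccompiler.py:88 above). It adds a job semaphore, a lock-guarded set of
# objects already being compiled, and an optional ThreadPool so that several
# sources can be built in parallel; get_num_build_jobs() picks the width.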
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o', ('/tmp/tmpz8k_fc4m.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
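# The single_compile frame above is the concurrency guard numpy.distutils wraps
# around every compiler invocation. Below is a minimal, self-contained sketch
# of that pattern -- not numpy.distutils code; compile_one, SOURCES and JOBS
# are illustrative names, and unlike the original this version skips an
# in-progress object instead of waiting for it.
import threading
from multiprocessing.pool import ThreadPool

JOBS = 4                               # illustrative; numpy uses get_num_build_jobs()
_lock = threading.Lock()
_sem = threading.Semaphore(JOBS)
_in_progress = set()

def compile_one(src):
    with _lock:
        if src in _in_progress:        # another extension is already building it
            return
        _in_progress.add(src)
    try:
        with _sem:                     # take one of the JOBS build slots
            print("would compile", src)  # stand-in for self._compile(...)
    finally:
        with _lock:
            _in_progress.discard(src)

SOURCES = ["a.cpp", "b.cpp", "c.cpp"]  # hypothetical inputs
with ThreadPool(JOBS) as pool:
    pool.map(compile_one, SOURCES)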
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
args = ('/tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o', '/tmp/tmpz8k_fc4m.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...])
kw = {}

>   m = lambda self, *args, **kw: func(self, *args, **kw)

/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
obj = '/tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o', src = '/tmp/tmpz8k_fc4m.cpp'
ext = '.cpp'
cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]
extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]
pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]

    def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts):
        """Compile a single source files with a Unix-style compiler."""
        # HP ad-hoc fix, see ticket 1383
        ccomp = self.compiler_so
        if ccomp[0] == 'aCC':
            # remove flags that will trigger ANSI-C mode for aCC
            if '-Ae' in ccomp:
                ccomp.remove('-Ae')
            if '-Aa' in ccomp:
                ccomp.remove('-Aa')
            # add flags for (almost) sane C++ handling
            ccomp += ['-AA']
            self.compiler_so = ccomp
        # ensure OPT environment variable is read
        if 'OPT' in os.environ:  # XXX who uses this?
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz8k_fc4m.cpp -o /tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gumbel1', cxxfile = '/tmp/tmpz8k_fc4m.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbasq35jf', buildtmp = '/tmp/tmpd3ne7yn2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
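# Note: when dist.run_commands() below fails, distutils turns the
# DistutilsError/CCompilerError into SystemExit("error: " + str(msg)), and
# pythran's compile_cxxfile() then catches that SystemExit and re-raises it as
# distutils.errors.CompileError -- which is why the same gcc command line is
# printed twice for every failing test in this log.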
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz8k_fc4m.cpp -o /tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gumbel1(self): """ Check gumbel with size argument with mean and variance.""" code = """ def numpy_gumbel1(size): from numpy.random import gumbel from numpy import var, mean from numpy import var, mean, pi u = 0. 
s = 1 rmean = u + 0.57721*s rvar = (pi**2/6)*s**2 a = gumbel(size=size) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_gumbel1=[int]) pythran/tests/test_numpy_random.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gumbel1', cxxfile = '/tmp/tmpz8k_fc4m.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpbasq35jf', buildtmp = '/tmp/tmpd3ne7yn2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpz8k_fc4m.cpp -o /tmp/tmpd3ne7yn2/tmp/tmpz8k_fc4m.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
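Before the captured compiler output, it is worth spelling out what test_numpy_gumbel1 actually checks: the sample mean of a standard Gumbel draw should be close to u + 0.57721*s (the Euler-Mascheroni constant times the scale) and the sample variance close to pi**2*s**2/6. A pure-NumPy version of the same check, runnable without pythran and using the test's 0.05 tolerances, looks like this:

import numpy as np

def numpy_gumbel1(size):
    u, s = 0.0, 1.0
    rmean = u + 0.57721 * s            # E[X] = mu + gamma*beta for Gumbel(mu, beta)
    rvar = (np.pi ** 2 / 6) * s ** 2   # Var[X] = pi^2 * beta^2 / 6
    a = np.random.gumbel(size=size)
    return abs(a.mean() - rmean) < .05 and abs(a.var() - rvar) < .05

print(numpy_gumbel1(10 ** 6))          # the reference (CPython) side of run_test's comparison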
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gumbel1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpd3ne7yn2/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpz8k_fc4m.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpz8k_fc4m.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpz8k_fc4m.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _________________ TestNumpyRandom.test_numpy_standard_normal0 __________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
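# Note: same distutils dispatch as in the gumbel failure above; the locals
# differ only in the temporary file and module names (tmp2qekmmuw.cpp,
# test_numpy_standard_normal0).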
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp2qekmmuw.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp2qekmmuw.cpp'], output_dir = '/tmp/tmpdj5a2v0l' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o', ('/tmp/tmp2qekmmuw.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
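# The standard_normal0 case follows exactly the same path as the two failures
# above. When iterating on a fix, one way to re-run only these cases is a
# sketch like the following (assumes the current directory is the unpacked
# pythran-feature-0.11.0 source tree; test names are taken from this log):
import sys
import pytest

sys.exit(pytest.main([
    "pythran/tests/test_numpy_random.py",
    "-k", "numpy_logseries0 or numpy_gumbel1 or numpy_standard_normal0",
    "-x",                              # stop at the first failure while debugging
]))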
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o', '/tmp/tmp2qekmmuw.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o', src = '/tmp/tmp2qekmmuw.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2qekmmuw.cpp -o /tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_normal0', cxxfile = '/tmp/tmp2qekmmuw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpwdk9s7zq', buildtmp = '/tmp/tmpdj5a2v0l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2qekmmuw.cpp -o /tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_normal0(self): """ Check standard_normal without argument with mean and variance. 
""" code = """ def numpy_standard_normal0(size): from numpy.random import standard_normal from numpy import var, mean a = [standard_normal() for x in range(size)] print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_normal0=[int]) pythran/tests/test_numpy_random.py:237: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_normal0', cxxfile = '/tmp/tmp2qekmmuw.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpwdk9s7zq', buildtmp = '/tmp/tmpdj5a2v0l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2qekmmuw.cpp -o /tmp/tmpdj5a2v0l/tmp/tmp2qekmmuw.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_normal0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpdj5a2v0l/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp2qekmmuw.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp2qekmmuw.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp2qekmmuw.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ____________________ TestNumpyRandom.test_numpy_logseries1 _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
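Note on the failure above: every C++ error in this run is the same one, raised from pythonic/include/numpy/conjugate.hpp because conj is not declared in namespace xsimd by the headers in use (gcc suggests std::conj instead), and the var/mean calls in these tests pull that header in through numpy/var.hpp. The path can be exercised without the test harness; the sketch below is an assumption-laden reproduction that reuses the captured test body, adds a '#pythran export' line, and guesses from the traceback frames that compile_pythrancode takes the module name and the Pythran source as its first two arguments.

# Hedged reproduction sketch, not part of this build. The export line and the
# compile_pythrancode calling convention are assumptions based on the traceback.
from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

code = '''
#pythran export numpy_standard_normal0(int)
def numpy_standard_normal0(size):
    from numpy.random import standard_normal
    from numpy import var, mean
    a = [standard_normal() for x in range(size)]
    return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05)
'''

try:
    compile_pythrancode("repro_standard_normal0", code)
    print("compiled cleanly")
except CompileError as exc:
    print("reproduced the failure:", exc)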
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpp4izcfd6.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpp4izcfd6.cpp'], output_dir = '/tmp/tmpxte6yhzp' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o', ('/tmp/tmpp4izcfd6.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o', '/tmp/tmpp4izcfd6.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o', src = '/tmp/tmpp4izcfd6.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp4izcfd6.cpp -o /tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logseries1', cxxfile = '/tmp/tmpp4izcfd6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqvupq60g', buildtmp = '/tmp/tmpxte6yhzp' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp4izcfd6.cpp -o /tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logseries1(self): """ Check logseries with size argument with mean and variance.""" code = """ def numpy_logseries1(size): from numpy.random import logseries from numpy import var, mean, log s = 0.25 rmean = s / (log(1 - s)*(s - 1)) rvar = -(s*(s+log(1-s)))/((s - 1)**2*(log(1-s))**2) a = logseries(s, size) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 5, numpy_logseries1=[int]) pythran/tests/test_numpy_random.py:1554: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logseries1', cxxfile = '/tmp/tmpp4izcfd6.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpqvupq60g', buildtmp = 
'/tmp/tmpxte6yhzp' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp4izcfd6.cpp -o /tmp/tmpxte6yhzp/tmp/tmpp4izcfd6.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logseries1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpxte6yhzp/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpp4izcfd6.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpp4izcfd6.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpp4izcfd6.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_gumbel2 ______________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
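Aside: the moment identities that test_numpy_logseries1 asserts (rmean = s/(log(1-s)*(s-1)), rvar = -(s*(s+log(1-s)))/((s-1)**2*log(1-s)**2)) can be exercised with plain NumPy, independently of the failed C++ compilation above. A minimal sketch, assuming only numpy is available; the variable names mirror the test but are otherwise illustrative:

    import numpy as np

    # Same parameter, sample count and tolerances as test_numpy_logseries1 above.
    s = 0.25
    rmean = s / (np.log(1 - s) * (s - 1))
    rvar = -(s * (s + np.log(1 - s))) / ((s - 1) ** 2 * np.log(1 - s) ** 2)

    a = np.random.logseries(s, 10 ** 5)
    print(abs(a.mean() - rmean) < .05, abs(a.var() - rvar) < .05)  # expect: True True
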
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp7j3b0gb1.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp7j3b0gb1.cpp'], output_dir = '/tmp/tmpipmhngfa' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o', ('/tmp/tmp7j3b0gb1.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
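Aside: the single_compile guard in the CCompiler_compile excerpt above combines a lock-protected set of in-flight object files with a semaphore that bounds how many compiles run concurrently. A stripped-down sketch of that coordination pattern, with toy names and a sleep standing in for the real self._compile call:

    import threading
    import time
    from concurrent.futures import ThreadPoolExecutor

    _lock = threading.Lock()          # plays the role of _global_lock
    _in_flight = set()                # plays the role of _processing_files
    _slots = threading.Semaphore(2)   # plays the role of _job_semaphore (jobs == 2)

    def single_compile(obj):
        # Wait until no other worker is building the same object file.
        while True:
            with _lock:
                if obj not in _in_flight:
                    _in_flight.add(obj)
                    break
            time.sleep(0.1)
        try:
            with _slots:              # at most two "compiles" at a time
                time.sleep(0.05)      # stand-in for self._compile(...)
                return obj
        finally:
            with _lock:
                _in_flight.remove(obj)

    with ThreadPoolExecutor(max_workers=4) as pool:
        print(sorted(pool.map(single_compile, ["a.o", "b.o", "a.o", "c.o"])))
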
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o', '/tmp/tmp7j3b0gb1.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o', src = '/tmp/tmp7j3b0gb1.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7j3b0gb1.cpp -o /tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_gumbel2', cxxfile = '/tmp/tmp7j3b0gb1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuai017p6', buildtmp = '/tmp/tmpipmhngfa' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_gumbel2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
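Aside: each failure above ends in a CompileError whose message embeds the full gcc invocation. When triaging, it can help to pull that command back out and re-run it by hand to see the cc1plus diagnostics without the pytest and distutils framing. A small sketch, assuming the numpy.distutils wording quoted above ('Command "..." failed with exit status 1'); rerun_failed_compile is an illustrative helper, not part of pythran:

    import re
    import subprocess

    def rerun_failed_compile(message):
        # Extract the quoted compiler command from a CompileError/SystemExit text.
        m = re.search(r'Command "(.+)" failed with exit status', message)
        if m is None:
            raise ValueError("no compiler command found in message")
        # Re-running it directly reproduces the C++ error in isolation.
        return subprocess.run(m.group(1), shell=True)

    # Usage: rerun_failed_compile(str(exc)) with the CompileError caught from setup().
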
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7j3b0gb1.cpp -o /tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_gumbel2(self): """Check gumbel with shape argument with mean and variance.""" code = """ def numpy_gumbel2(size): from numpy.random import gumbel from numpy import mean, var from numpy import var, mean, pi u = 0 s = 1 rmean = u + 0.57721*s rvar = (pi**2/6)*s**2 a = gumbel(size=(size, size)) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_gumbel2=[int]) pythran/tests/test_numpy_random.py:1362: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_gumbel2', cxxfile = '/tmp/tmp7j3b0gb1.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuai017p6', buildtmp = '/tmp/tmpipmhngfa' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp7j3b0gb1.cpp -o /tmp/tmpipmhngfa/tmp/tmp7j3b0gb1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_gumbel2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpipmhngfa/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp7j3b0gb1.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp7j3b0gb1.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp7j3b0gb1.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _________________ TestNumpyRandom.test_numpy_standard_normal1 __________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
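Aside: the root cause is the same in every traceback: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, which the xsimd headers used in this buildroot do not declare, so any test whose generated C++ reaches pythonic/numpy/var.hpp fails. A minimal way to reproduce it outside pytest, assuming pythran.toolchain.compile_pythrancode(module_name, code) behaves as in the frames above; numpy_var_repro and its export line are illustrative stand-ins for the spec the test harness derives from e.g. numpy_gumbel2=[int]:

    from textwrap import dedent
    from pythran.toolchain import compile_pythrancode

    code = dedent("""
        #pythran export numpy_var_repro(int)
        def numpy_var_repro(size):
            from numpy.random import gumbel
            from numpy import var
            return var(gumbel(size=size))
    """)

    # On this buildroot the generated C++ pulls in pythonic/numpy/var.hpp and then
    # conjugate.hpp, so this is expected to raise CompileError with the same
    # "'conj' is not a member of 'xsimd'" diagnostic as above.
    compile_pythrancode("numpy_var_repro", code)
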
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpr3tzu05x.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpr3tzu05x.cpp'], output_dir = '/tmp/tmpq2c4vprk' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o', ('/tmp/tmpr3tzu05x.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o', '/tmp/tmpr3tzu05x.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o', src = '/tmp/tmpr3tzu05x.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpr3tzu05x.cpp -o /tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_normal1', cxxfile = '/tmp/tmpr3tzu05x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuankx6jr', buildtmp = '/tmp/tmpq2c4vprk' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
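
The setup() docstring above describes the distutils gateway: build a Distribution from the keyword arguments, parse config files, parse the command line, then run each command object. pythran's compile_cxxfile, visible earlier in this traceback, drives that gateway programmatically by faking a command line. A condensed sketch of the same pattern follows; the module name and source file are invented, and it assumes PythranExtension and PythranBuildExt are importable from pythran.dist, as the pythran/dist.py frames suggest.

    from tempfile import mkdtemp
    from distutils.core import setup
    from pythran.dist import PythranExtension, PythranBuildExt

    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = PythranExtension('example_module', ['example_module.cpp'])  # placeholder source

    setup(name='example_module',
          ext_modules=[ext],
          cmdclass={'build_ext': PythranBuildExt},
          # fake CLI call, mirroring compile_cxxfile above
          script_name='setup.py',
          script_args=['--quiet', 'build_ext',
                       '--build-lib', builddir,
                       '--build-temp', buildtmp])

On failure this path raises SystemExit, which compile_cxxfile converts back into a CompileError, as seen further down in this log.
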
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpr3tzu05x.cpp -o /tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_normal1(self): """ Check standard_normal with size argument with mean and variance.""" code = """ def numpy_standard_normal1(size): from numpy.random import standard_normal from numpy import var, mean a = standard_normal(size) print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_standard_normal1=[int]) pythran/tests/test_numpy_random.py:249: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_normal1', cxxfile = '/tmp/tmpr3tzu05x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpuankx6jr', buildtmp = '/tmp/tmpq2c4vprk' extension = def compile_cxxfile(module_name, 
cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpr3tzu05x.cpp -o /tmp/tmpq2c4vprk/tmp/tmpr3tzu05x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_normal1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 
-fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpq2c4vprk/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpr3tzu05x.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpr3tzu05x.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpr3tzu05x.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_________________ TestNumpyRandom.test_numpy_standard_normal2 __________________
[gw0] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal2', ...}
klass = dist = ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpfj7e5tfd.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpfj7e5tfd.cpp'], output_dir = '/tmp/tmpq89yaqni' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o', ('/tmp/tmpfj7e5tfd.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
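
The numpy.distutils compile loop shown above bounds concurrency with a module-level semaphore sized by get_num_build_jobs(), hands the work items to a ThreadPool, and uses a lock plus a "currently processing" set so two extensions never build the same object file at once. A stripped-down, self-contained sketch of that throttling pattern (not the library code itself; the real loop also skips objects that _needs_build reports as up to date):

    import threading
    import time
    from multiprocessing.pool import ThreadPool

    jobs = 4                                    # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)   # at most `jobs` compiles in flight
    lock = threading.Lock()
    in_progress = set()                         # object files currently being built

    def build_one(obj):
        # wait until no other worker is building this object file
        while True:
            with lock:
                if obj not in in_progress:
                    in_progress.add(obj)
                    break
            time.sleep(0.1)
        try:
            with job_semaphore:                 # take a compile slot
                print('compiling', obj)         # the real code spawns the compiler here
                time.sleep(0.2)
        finally:
            with lock:
                in_progress.remove(obj)

    pool = ThreadPool(jobs)
    pool.map(build_one, ['a.o', 'b.o', 'c.o', 'd.o'])
    pool.close()
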
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o', '/tmp/tmpfj7e5tfd.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o', src = '/tmp/tmpfj7e5tfd.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfj7e5tfd.cpp -o /tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_standard_normal2', cxxfile = '/tmp/tmpfj7e5tfd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpl6_8oiw4', buildtmp = '/tmp/tmpq89yaqni' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_standard_normal2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
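
Every failure in this run bottoms out in the same diagnostic captured above: the generated C++ pulls in pythonic/include/numpy/conjugate.hpp, whose line 25 calls xsimd::conj(v), and the xsimd headers used for this build do not define conj. A compact way to reproduce one case outside the test harness is to feed a kernel straight to the toolchain entry point the tests call, compile_pythrancode. The export annotation and the flag list below are assumptions modelled on the extra_compile_args visible in the kwargs above, not values copied from this log.

    from textwrap import dedent
    from pythran.toolchain import compile_pythrancode

    code = dedent('''
        #pythran export repro_standard_normal(int)
        def repro_standard_normal(size):
            from numpy.random import standard_normal
            from numpy import mean, var
            a = standard_normal(size)
            return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05)
    ''')

    # Expected to raise distutils.errors.CompileError carrying the same
    # "'conj' is not a member of 'xsimd'" message while that symbol is missing.
    compile_pythrancode('repro_standard_normal', code,
                        extra_compile_args=['-O1', '-w', '-UNDEBUG'])
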
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfj7e5tfd.cpp -o /tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_standard_normal2(self): """Check standard_normal with shape argument with mean and variance.""" code = """ def numpy_standard_normal2(size): from numpy.random import standard_normal from numpy import mean, var a = standard_normal((size, size)) print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 3, numpy_standard_normal2=[int]) pythran/tests/test_numpy_random.py:261: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_standard_normal2', cxxfile = '/tmp/tmpfj7e5tfd.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpl6_8oiw4', buildtmp = '/tmp/tmpq89yaqni' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpfj7e5tfd.cpp -o /tmp/tmpq89yaqni/tmp/tmpfj7e5tfd.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_standard_normal2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpq89yaqni/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpfj7e5tfd.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpfj7e5tfd.cpp:25:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpfj7e5tfd.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
____________________ TestNumpyRandom.test_numpy_logseries2 _____________________
[gw7] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries2', ...}
klass = dist = ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp66a_hiu0.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp66a_hiu0.cpp'], output_dir = '/tmp/tmpomznbmol' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpomznbmol/tmp/tmp66a_hiu0.o', ('/tmp/tmp66a_hiu0.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
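(The CCompiler_compile frame above throttles parallel builds with a module-level semaphore plus a thread pool. Here is a stripped-down sketch of that pattern, with a stand-in compile step and a hard-coded job count in place of get_num_build_jobs(); it is not the numpy.distutils implementation itself.)

import threading
from multiprocessing.pool import ThreadPool

jobs = 4                                     # stand-in for get_num_build_jobs()
job_semaphore = threading.Semaphore(jobs)    # caps concurrent compiler invocations

def single_compile(task):
    obj, src = task
    with job_semaphore:                      # hold a slot while the compiler runs
        print("compiling %s -> %s" % (src, obj))   # stand-in for self._compile(...)

build_items = [("a.o", "a.cpp"), ("b.o", "b.cpp"), ("c.o", "c.cpp")]
if len(build_items) > 1 and jobs > 1:
    with ThreadPool(jobs) as pool:           # threads suffice: the compiler runs as a subprocess
        pool.map(single_compile, build_items)
else:
    for item in build_items:
        single_compile(item)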
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpomznbmol/tmp/tmp66a_hiu0.o', '/tmp/tmp66a_hiu0.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpomznbmol/tmp/tmp66a_hiu0.o', src = '/tmp/tmp66a_hiu0.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp66a_hiu0.cpp -o /tmp/tmpomznbmol/tmp/tmp66a_hiu0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_logseries2', cxxfile = '/tmp/tmp66a_hiu0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_phgu245', buildtmp = '/tmp/tmpomznbmol' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_logseries2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
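(The SystemExit raised below is what pythran's compile_cxxfile, shown elsewhere in this traceback, converts back into a CompileError. A minimal sketch of that calling pattern follows, assuming a pre-generated C++ file and the PythranExtension/PythranBuildExt classes from pythran.dist; the helper name build_native_module and the temporary paths are illustrative.)

from tempfile import mkdtemp
from distutils.core import setup
from distutils.errors import CompileError
from pythran.dist import PythranExtension, PythranBuildExt

def build_native_module(module_name, cxxfile, **kwargs):
    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = PythranExtension(module_name, [cxxfile], **kwargs)
    try:
        # fake CLI call, mirroring pythran.toolchain.compile_cxxfile
        setup(name=module_name,
              ext_modules=[ext],
              cmdclass={"build_ext": PythranBuildExt},
              script_name='setup.py',
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as e:
        # distutils reports build failures as SystemExit; surface them as CompileError
        raise CompileError(str(e))
    return builddir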
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp66a_hiu0.cpp -o /tmp/tmpomznbmol/tmp/tmp66a_hiu0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_logseries2(self): """Check logseries with shape argument with mean and variance.""" code = """ def numpy_logseries2(size): from numpy.random import logseries from numpy import mean, var, log s = 0.2 rmean = s / (log(1 - s)*(s - 1)) rvar = -(s*(s+log(1-s)))/((s - 1)**2*(log(1-s))**2) a = logseries(s, (size, size)) return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_logseries2=[int]) pythran/tests/test_numpy_random.py:1568: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_logseries2', cxxfile = '/tmp/tmp66a_hiu0.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_phgu245', buildtmp = 
'/tmp/tmpomznbmol' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp66a_hiu0.cpp -o /tmp/tmpomznbmol/tmp/tmp66a_hiu0.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_logseries2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS 
-fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpomznbmol/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp66a_hiu0.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmp66a_hiu0.cpp:23:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp66a_hiu0.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_laplace0 ______________________
[gw1] linux -- Python 3.10.1 /usr/bin/python3
attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0', ...}
klass = dist = ok = True
def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way.
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpglffmhiv.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpglffmhiv.cpp'], output_dir = '/tmp/tmp2id6sf7v' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o', ('/tmp/tmpglffmhiv.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o', '/tmp/tmpglffmhiv.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o', src = '/tmp/tmpglffmhiv.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpglffmhiv.cpp -o /tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_laplace0', cxxfile = '/tmp/tmpglffmhiv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmprs5_nqsp', buildtmp = '/tmp/tmp2id6sf7v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpglffmhiv.cpp -o /tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_laplace0(self): """ Check laplace without argument with mean and variance. """ code = """ def numpy_laplace0(size): from numpy.random import laplace from numpy import var, mean, pi u = 0. 
s = 1 rmean = u rvar = 2*s**2 a = [laplace() for x in range(size)] return (abs(mean(a) - rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_laplace0=[int]) pythran/tests/test_numpy_random.py:1460: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_laplace0', cxxfile = '/tmp/tmpglffmhiv.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmprs5_nqsp', buildtmp = '/tmp/tmp2id6sf7v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpglffmhiv.cpp -o /tmp/tmp2id6sf7v/tmp/tmpglffmhiv.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
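The statistical property the failing test asserts (quoted above: mean u and variance 2*s**2 for Laplace(u, s) draws, within 0.05) can be checked with plain NumPy, independently of the Pythran compilation that fails on this builder; test_numpy_uniform_no_arg below performs the analogous check for the uniform distribution. A minimal sketch using the same sample size and tolerances:

import numpy as np

size = 10 ** 6
a = np.random.laplace(loc=0.0, scale=1.0, size=size)
# Laplace(u=0, s=1): mean is u and variance is 2*s**2, matching rmean/rvar in the test
assert abs(a.mean() - 0.0) < .05
assert abs(a.var() - 2.0) < .05
print("laplace mean/variance within tolerance")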
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_laplace0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp2id6sf7v/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpglffmhiv.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpglffmhiv.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpglffmhiv.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! __________________ TestNumpyRandom.test_numpy_uniform_no_arg ___________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_no_arg', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp162o_4po.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp162o_4po.cpp'], output_dir = '/tmp/tmpl2hsbh5b' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o', ('/tmp/tmp162o_4po.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
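The same chain recurs for every failing test: run_test hands the generated C++ to compile_pythrancode, which ends in compile_cxxfile and the distutils frames shown here, and the resulting CompileError propagates back up. That entry point can also be driven outside the test harness; the sketch below uses an illustrative kernel and module name, and since its numpy.var import pulls in the same pythonic/include/numpy/conjugate.hpp chain shown in the diagnostics, it would likely fail the same way on this builder:

from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode  # entry point seen at pythran/toolchain.py in the traceback

# illustrative kernel, not part of the pythran test suite
kernel = """
#pythran export spread(float list)
def spread(a):
    from numpy import mean, var
    return mean(a), var(a)
"""

try:
    # returns the path of the built native module on success
    native = compile_pythrancode("spread_module", kernel)
    print("built", native)
except CompileError as exc:
    # on this ppc64le builder the C++ stage fails in conjugate.hpp
    # ('conj' is not a member of 'xsimd') and surfaces here, as in the log
    print("compilation failed:", exc)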
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o', '/tmp/tmp162o_4po.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o', src = '/tmp/tmp162o_4po.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp162o_4po.cpp -o /tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_uniform_no_arg', cxxfile = '/tmp/tmp162o_4po.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3_o_n4gs', buildtmp = '/tmp/tmpl2hsbh5b' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_no_arg', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp162o_4po.cpp -o /tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_uniform_no_arg(self): """ Check logseries without argument with mean and variance. 
""" code = """ def numpy_uniform_no_arg(size): import numpy as np from numpy.random import uniform low, high = 0.0, 1.0 a = np.array([uniform() for _ in range(size)]) rmean = 0.5 * (low + high) rvar = (high - low) ** 2 / 12 cond_mean = (a.mean() - rmean) / rmean < 0.05 cond_var = (np.var(a) - rvar) / rvar < 0.05 return cond_mean and cond_var """ > self.run_test(code, 4000, numpy_uniform_no_arg=[int]) pythran/tests/test_numpy_random.py:1588: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_uniform_no_arg', cxxfile = '/tmp/tmp162o_4po.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp3_o_n4gs', buildtmp = '/tmp/tmpl2hsbh5b' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp162o_4po.cpp -o /tmp/tmpl2hsbh5b/tmp/tmp162o_4po.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function 
-Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_uniform_no_arg' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpl2hsbh5b/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp162o_4po.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp162o_4po.cpp:32: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp162o_4po.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_normal0 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
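The ordering this comment describes (continued just below: the Extension's extra_compile_args are appended after the compiler's baseline flags) is visible in the captured command line, where the distro-provided -O2 is followed by pythran's per-test -O1. A tiny illustration of the merge, with flag values taken from the log above:

    cflags = ["-O2", "-Wall", "-Werror=format-security"]      # baseline compiler_so flags
    extra_compile_args = ["-O1", "-Wall", "-w", "-UNDEBUG"]    # per-test extra_compile_args
    tail = cflags + extra_compile_args
    # for competing -O levels gcc honours the last one given,
    # so the effective optimisation level for these test builds is -O1
    print(" ".join(tail))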
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpksqtqdg8.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpksqtqdg8.cpp'], output_dir = '/tmp/tmprdww0jgh' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o', ('/tmp/tmpksqtqdg8.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
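The _job_semaphore bookkeeping captured in this frame is numpy.distutils' way of capping the number of concurrent compiler invocations when builds are parallelized at the extension level. Stripped of the per-object locking, the pattern is just a shared semaphore inside a thread pool; a minimal sketch with an illustrative job count:

    import threading
    from multiprocessing.pool import ThreadPool

    jobs = 4                                  # stand-in for get_num_build_jobs()
    job_semaphore = threading.Semaphore(jobs)

    def single_compile(src):
        # take a slot before doing the expensive work, as in the frame above
        with job_semaphore:
            print("compiling", src)           # placeholder for self._compile(...)

    with ThreadPool(jobs) as pool:
        pool.map(single_compile, ["a.cpp", "b.cpp", "c.cpp"])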
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o', '/tmp/tmpksqtqdg8.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o', src = '/tmp/tmpksqtqdg8.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
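UnixCCompiler__compile, whose body continues below, assembles compiler_so, the preprocessor/include arguments, the source/object pair, optional dependency-tracking flags and extra_postargs into the single gcc command line quoted in the CompileError. A rough sketch of that assembly, with placeholder argument values:

    import subprocess

    def spawn_compile(compiler_so, cc_args, src, obj, extra_postargs, auto_depends=True):
        # same argv shape as the spawn() call in the frame below
        deps = ["-MMD", "-MF", obj + ".d"] if auto_depends else []
        argv = compiler_so + cc_args + [src, "-o", obj] + deps + extra_postargs
        # a non-zero exit status here corresponds to the
        # "failed with exit status 1" CompileError seen above
        subprocess.run(argv, check=True)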
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpksqtqdg8.cpp -o /tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_normal0', cxxfile = '/tmp/tmpksqtqdg8.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjg_cou_3', buildtmp = '/tmp/tmprdww0jgh' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
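compile_cxxfile, whose source is captured above, never writes a setup.py; it drives build_ext programmatically by handing setup() a fake script name and argument list. The same pattern in isolation, with made-up module and file names, looks roughly like this:

    from tempfile import mkdtemp
    from distutils.core import setup
    from distutils.extension import Extension

    def build_native_module(module_name, cxxfile):
        # mirror the fake-CLI build_ext call captured in the traceback
        builddir, buildtmp = mkdtemp(), mkdtemp()
        ext = Extension(module_name, [cxxfile], language="c++")
        setup(name=module_name,
              ext_modules=[ext],
              script_name="setup.py",
              script_args=["--quiet", "build_ext",
                           "--build-lib", builddir,
                           "--build-temp", buildtmp])
        return builddir   # the built shared object lands here on success

On failure distutils raises SystemExit, which compile_cxxfile converts into the CompileError propagated through these tracebacks.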
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpksqtqdg8.cpp -o /tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_normal0(self): """ Check normal without argument with mean and variance. 
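The check described here (normal() without arguments, validated through its mean and variance) and the earlier uniform check can be exercised with plain NumPy, independently of the failing C++ build. A minimal sketch of the same acceptance criteria:

    import numpy as np

    def check_uniform(size=4000, low=0.0, high=1.0, tol=0.05):
        # same criterion as numpy_uniform_no_arg: relative error of the
        # sample mean and variance against the analytic values
        a = np.random.uniform(low, high, size)
        rmean = 0.5 * (low + high)
        rvar = (high - low) ** 2 / 12
        return (a.mean() - rmean) / rmean < tol and (np.var(a) - rvar) / rvar < tol

    def check_normal(size=10 ** 5, tol=0.05):
        # same criterion as numpy_normal0: mean close to 0, variance close to 1
        a = np.random.normal(size=size)
        return abs(a.mean()) < tol and abs(np.var(a) - 1) < tol

    print(check_uniform(), check_normal())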
""" code = """ def numpy_normal0(size): from numpy.random import normal from numpy import var, mean a = [normal() for x in range(size)] print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_normal0=[int]) pythran/tests/test_numpy_random.py:277: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_normal0', cxxfile = '/tmp/tmpksqtqdg8.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpjg_cou_3', buildtmp = '/tmp/tmprdww0jgh' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpksqtqdg8.cpp -o /tmp/tmprdww0jgh/tmp/tmpksqtqdg8.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed 
with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_normal0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmprdww0jgh/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpksqtqdg8.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpksqtqdg8.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpksqtqdg8.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_laplace0a _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpte76bssl.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpte76bssl.cpp'], output_dir = '/tmp/tmp0el5kr47' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp0el5kr47/tmp/tmpte76bssl.o', ('/tmp/tmpte76bssl.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp0el5kr47/tmp/tmpte76bssl.o', '/tmp/tmpte76bssl.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp0el5kr47/tmp/tmpte76bssl.o', src = '/tmp/tmpte76bssl.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpte76bssl.cpp -o /tmp/tmp0el5kr47/tmp/tmpte76bssl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_laplace0a', cxxfile = '/tmp/tmpte76bssl.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppny3xn7t', buildtmp = '/tmp/tmp0el5kr47' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpte76bssl.cpp -o /tmp/tmp0el5kr47/tmp/tmpte76bssl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_laplace0a(self): """ Check laplace with 1 argument with mean and variance. 
""" code = """ def numpy_laplace0a(size): from numpy.random import laplace from numpy import var, mean, pi u = 2 s = 1 rmean = u rvar = 2*s**2 a = [laplace(u) for x in range(size)] return (abs(mean(a) - rmean ) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_laplace0a=[int]) pythran/tests/test_numpy_random.py:1475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_laplace0a', cxxfile = '/tmp/tmpte76bssl.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppny3xn7t', buildtmp = '/tmp/tmp0el5kr47' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpte76bssl.cpp -o /tmp/tmp0el5kr47/tmp/tmpte76bssl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_laplace0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp0el5kr47/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpte76bssl.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpte76bssl.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpte76bssl.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _________________ TestNumpyRandom.test_numpy_uniform_size_int __________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_size_int', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
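The traceback captured above shows how pythran builds each generated C++ file: compile_cxxfile wraps it in a PythranExtension, hands it to setup() with a faked command line, and turns the resulting SystemExit back into a CompileError, which is why the same gcc command line reappears several times in each failure. The following is a minimal sketch of that flow, not pythran's own code; 'kernel.cpp' and 'demo_kernel' are placeholder names chosen for illustration, and the setup() entry point is the numpy.distutils one that the call chain above passes through.

    # Sketch of the compile_cxxfile flow seen in the traceback above.
    # 'kernel.cpp' and 'demo_kernel' are hypothetical names; any
    # pythran-generated C++ file would do.
    from tempfile import mkdtemp

    from numpy.distutils.core import setup   # same setup() the trace goes through
    from pythran.dist import PythranExtension, PythranBuildExt

    builddir = mkdtemp()   # --build-lib target
    buildtmp = mkdtemp()   # --build-temp target

    ext = PythranExtension('demo_kernel', ['kernel.cpp'])

    # Fake a command line instead of writing a real setup.py, the way
    # pythran/toolchain.py does in the trace above.
    setup(name='demo_kernel',
          ext_modules=[ext],
          cmdclass={'build_ext': PythranBuildExt},
          script_name='setup.py',
          script_args=['--quiet', 'build_ext',
                       '--build-lib', builddir,
                       '--build-temp', buildtmp])

When the underlying gcc invocation fails, distutils raises SystemExit("error: ...") and compile_cxxfile re-raises it as distutils.errors.CompileError, matching the pair of exceptions captured for test_numpy_laplace0a above.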
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp9377gmsc.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp9377gmsc.cpp'], output_dir = '/tmp/tmpm4afkgf_' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o', ('/tmp/tmp9377gmsc.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o', '/tmp/tmp9377gmsc.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o', src = '/tmp/tmp9377gmsc.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9377gmsc.cpp -o /tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_uniform_size_int', cxxfile = '/tmp/tmp9377gmsc.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpp59rcctb', buildtmp = '/tmp/tmpm4afkgf_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_size_int', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9377gmsc.cpp -o /tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_uniform_size_int(self): """ Check logseries with arguments with mean and variance. """ code = """ def numpy_uniform_size_int(size): import numpy as np from numpy.random import uniform low, high = 0., 1234. 
rmean = 0.5 * (low + high) rvar = (high - low) ** 2 / 12 a = uniform(low, high, size) cond_mean = (a.mean() - rmean) / rmean < 0.05 cond_var = (np.var(a) - rvar) / rvar < 0.05 return cond_mean and cond_var """ > self.run_test(code, 4000, numpy_uniform_size_int=[int]) pythran/tests/test_numpy_random.py:1604: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_uniform_size_int', cxxfile = '/tmp/tmp9377gmsc.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpp59rcctb', buildtmp = '/tmp/tmpm4afkgf_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp9377gmsc.cpp -o /tmp/tmpm4afkgf_/tmp/tmp9377gmsc.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_uniform_size_int' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpm4afkgf_/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp9377gmsc.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp9377gmsc.cpp:26: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp9377gmsc.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_normal0a ______________________
[gw7] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0a', ...}
klass =
dist =
ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs to do,
    in a highly flexible and user-driven way.  Briefly: create a Distribution
    instance; find and parse config files; parse the command line; run each
    Distutils command found there, customized by the options supplied to 'setup()'
    (as keyword arguments), in config files, and on the command line.

    The Distribution instance might be an instance of a class supplied via the
    'distclass' keyword argument to 'setup'; if no such class is supplied, then
    the Distribution class (in dist.py) is instantiated.  All other arguments to
    'setup' (except for 'cmdclass') are used to set attributes of the Distribution
    instance.

    The 'cmdclass' argument, if supplied, is a dictionary mapping command names to
    command classes.  Each command encountered on the command line will be turned
    into a command class, which is in turn instantiated; any class found in
    'cmdclass' is used in place of the default, which is (for command 'foo_bar')
    class 'foo_bar' in module 'distutils.command.foo_bar'.  The command class must
    provide a 'user_options' attribute which is a list of option specifiers for
    'distutils.fancy_getopt'.  Any command-line options between the current and
    the next command are used to set attributes of the current command object.

    When the entire command-line has been successfully parsed, calls the 'run()'
    method on each command object in turn.  This method will be driven entirely
    by the Distribution object (which each command object has a reference to,
    thanks to its constructor), and the command-specific options that became
    attributes of each command object.
    """

    global _setup_stop_after, _setup_distribution

    # Determine the distribution class -- either caller-supplied or
    # our Distribution (see below).
    klass = attrs.get('distclass')
    if klass:
        del attrs['distclass']
    else:
        klass = Distribution

    if 'script_name' not in attrs:
        attrs['script_name'] = os.path.basename(sys.argv[0])
    if 'script_args' not in attrs:
        attrs['script_args'] = sys.argv[1:]

    # Create the Distribution instance, using the remaining arguments
    # (ie.
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp4k6_p6i3.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp4k6_p6i3.cpp'], output_dir = '/tmp/tmpbbvc6dvg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
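The frames above are stock distutils/numpy.distutils machinery rather than pythran code; the detail worth keeping in mind when reading the command lines in this log is that ext.extra_compile_args travel as extra_postargs and are appended after the distro CFLAGS, so the harness's trailing -O1 overrides the earlier -O2 and its -w silences everything -Wall would print. A small sketch of the same CCompiler path in isolation; demo.c and the output directory are illustrative names, not files from this build:

# Hedged sketch of the distutils CCompiler path used above:
# new_compiler() -> customize_compiler() -> compile(..., extra_postargs=...).
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

cc = new_compiler()                # UnixCCompiler on a builder like this one
customize_compiler(cc)             # folds in the interpreter's CC/CFLAGS settings
objects = cc.compile(
    ["demo.c"],                    # illustrative source file
    output_dir="build-tmp",
    extra_postargs=["-O1", "-w"],  # appended last, mirroring extra_compile_args
)
print(objects)
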
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o', ('/tmp/tmp4k6_p6i3.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o', '/tmp/tmp4k6_p6i3.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o', src = '/tmp/tmp4k6_p6i3.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4k6_p6i3.cpp -o /tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_normal0a', cxxfile = '/tmp/tmp4k6_p6i3.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppdbjy3dd', buildtmp = '/tmp/tmpbbvc6dvg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
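For context on how this gateway is entered here: compile_cxxfile (quoted earlier) calls it programmatically with a faked 'build_ext --build-lib ... --build-temp ...' command line and turns the SystemExit raised a few lines further down into a CompileError. Outside the test harness the same path is reached from an ordinary setup script; a minimal sketch using pythran's distribution helpers, with illustrative project and file names:

# Hedged sketch of the usual entry into this setup() gateway via pythran's
# distutils helpers; "mymodule" and "mymodule.py" are illustrative names.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

setup(
    name="mymodule",
    ext_modules=[PythranExtension("mymodule", ["mymodule.py"])],
    cmdclass={"build_ext": PythranBuildExt},
    # mirrors the fake CLI call assembled in compile_cxxfile
    script_args=["build_ext", "--inplace"],
)
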
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4k6_p6i3.cpp -o /tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_normal0a(self): """ Check normal with 1 argument with mean and variance. """ code = """ def numpy_normal0a(size): from numpy.random import normal from numpy import var, mean a = [normal(3.) 
for x in range(size)] print(mean(a)) return (abs(mean(a)-3) < 0.05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_normal0a=[int]) pythran/tests/test_numpy_random.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_normal0a', cxxfile = '/tmp/tmp4k6_p6i3.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppdbjy3dd', buildtmp = '/tmp/tmpbbvc6dvg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp4k6_p6i3.cpp -o /tmp/tmpbbvc6dvg/tmp/tmp4k6_p6i3.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call 
----------------------------- running build_ext new_compiler returns building 'test_numpy_normal0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpbbvc6dvg/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp4k6_p6i3.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp4k6_p6i3.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
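This is the same 'conj' failure as before, reached from a different test; none of these tests ever get as far as their actual assertions, which are loose sample-moment checks. For reference, the checks encoded by test_numpy_normal0a above and by the earlier uniform test look like the following in plain NumPy (bounds and sample size are illustrative):

# Hedged sketch of the moment checks the failing tests encode.
import numpy as np

size = 10 ** 5

a = np.random.normal(3., size=size)   # normal(3.): mean ~ 3, variance ~ 1
assert abs(a.mean() - 3) < 0.05 and abs(np.var(a) - 1) < 0.05

low, high = 0.0, 10.0                 # illustrative bounds
u = np.random.uniform(low, high, size)
rmean = 0.5 * (low + high)            # uniform mean: (low + high) / 2
rvar = (high - low) ** 2 / 12         # uniform variance: (high - low)^2 / 12
assert (u.mean() - rmean) / rmean < 0.05
assert (np.var(u) - rvar) / rvar < 0.05
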
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp4k6_p6i3.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_laplace0b _____________________ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp1omru3rt.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp1omru3rt.cpp'], output_dir = '/tmp/tmpeezt_mzo' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
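The semaphore logic that follows is numpy.distutils throttling parallel compilation; with a single generated .cpp per extension, as in these tests, each build is effectively serial. A hedged note on where the job count comes from (the environment override shown is illustrative, nothing in this build root sets it):

# get_num_build_jobs() honours NPY_NUM_BUILD_JOBS (and the build commands'
# --parallel option) and feeds the threading.Semaphore created just below.
import os
from numpy.distutils.misc_util import get_num_build_jobs

os.environ["NPY_NUM_BUILD_JOBS"] = "4"   # illustrative override
print(get_num_build_jobs())
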
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o', ('/tmp/tmp1omru3rt.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o', '/tmp/tmp1omru3rt.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o', src = '/tmp/tmp1omru3rt.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1omru3rt.cpp -o /tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_laplace0b', cxxfile = '/tmp/tmp1omru3rt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpds_o_xw5', buildtmp = '/tmp/tmpeezt_mzo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_laplace0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1omru3rt.cpp -o /tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_laplace0b(self): """ Check laplace with 2 argument with mean and variance. """ code = """ def numpy_laplace0b(size): from numpy.random import laplace from numpy import var, mean, pi u = 2. 
s = 2 rmean = u rvar = 2*s**2 a = laplace(u, s, size) return (abs(mean(a) - rmean) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_laplace0b=[int]) pythran/tests/test_numpy_random.py:1490: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_laplace0b', cxxfile = '/tmp/tmp1omru3rt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpds_o_xw5', buildtmp = '/tmp/tmpeezt_mzo' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1omru3rt.cpp -o /tmp/tmpeezt_mzo/tmp/tmp1omru3rt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
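Annotation: the call chain above (run_test -> compile_pythrancode -> compile_cxxcode -> compile_cxxfile) can be exercised outside the test suite to reproduce the same diagnostic in isolation. A minimal sketch, assuming the pythran command-line entry point is installed; the kernel is a stand-in that, like the failing test, goes through numpy.var and therefore pulls in pythonic/include/numpy/conjugate.hpp (see the include chain in the captured stderr below).

    import pathlib
    import subprocess
    import tempfile
    import textwrap

    # Stand-in kernel: calls numpy.var, so the generated C++ includes
    # pythonic/numpy/var.hpp and, through it, pythonic/include/numpy/conjugate.hpp.
    kernel = textwrap.dedent("""
        #pythran export variance(float64[])
        import numpy as np

        def variance(a):
            return np.var(a)
    """)

    src = pathlib.Path(tempfile.mkdtemp()) / "variance_kernel.py"
    src.write_text(kernel)

    # Run the pythran CLI on it; with the header mismatch seen in this build the
    # compile step is expected to fail with the same "'conj' is not a member of
    # 'xsimd'" error shown in the captured stderr.
    subprocess.run(["pythran", str(src)], check=False)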
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_laplace0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpeezt_mzo/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp1omru3rt.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp1omru3rt.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
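Annotation: the diagnostic above is the actual failure behind every CompileError in this log: pythonic/include/numpy/conjugate.hpp calls xsimd::conj on a complex batch, and the xsimd headers found first on the include path do not declare conj in namespace xsimd. A quick way to check which xsimd version the compiler resolves is to read its version macros; the sketch below assumes the system headers install xsimd/config/xsimd_config.hpp with XSIMD_VERSION_* defines (the path is an assumption, adjust as needed).

    import pathlib
    import re

    # Assumed location of the system xsimd version header.
    hdr = pathlib.Path("/usr/include/xsimd/config/xsimd_config.hpp")

    if hdr.is_file():
        macros = dict(re.findall(r"#define\s+(XSIMD_VERSION_\w+)\s+(\d+)",
                                 hdr.read_text()))
        print("system xsimd version macros:", macros)
    else:
        print("no system xsimd version header at", hdr)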
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp1omru3rt.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ________________ TestNumpyRandom.test_numpy_uniform_size_tuple _________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_size_tuple', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
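Annotation: the frames above show how compile_cxxfile drives a full distutils build_ext run programmatically via setup(). The same wiring, written as a stand-alone setup script using the PythranExtension and PythranBuildExt classes that appear in the traceback; module and file names are placeholders, and plain distutils setup is used here for brevity where the toolchain in this log routes through numpy.distutils.core.setup.

    from distutils.core import setup
    from pythran.dist import PythranExtension, PythranBuildExt

    # Placeholder module built from an already-generated C++ file, as
    # compile_cxxfile does with its temporary .cpp source.
    setup(
        name="example_module",
        ext_modules=[PythranExtension("example_module", ["example_module.cpp"])],
        cmdclass={"build_ext": PythranBuildExt},
        script_args=["build_ext", "--inplace"],   # fake CLI call, as in the log
    )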
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphg_oknab.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphg_oknab.cpp'], output_dir = '/tmp/tmpz9v30qfv' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpz9v30qfv/tmp/tmphg_oknab.o', ('/tmp/tmphg_oknab.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpz9v30qfv/tmp/tmphg_oknab.o', '/tmp/tmphg_oknab.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpz9v30qfv/tmp/tmphg_oknab.o', src = '/tmp/tmphg_oknab.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphg_oknab.cpp -o /tmp/tmpz9v30qfv/tmp/tmphg_oknab.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_uniform_size_tuple', cxxfile = '/tmp/tmphg_oknab.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1mju17w7', buildtmp = '/tmp/tmpz9v30qfv' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_uniform_size_tuple', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphg_oknab.cpp -o /tmp/tmpz9v30qfv/tmp/tmphg_oknab.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_uniform_size_tuple(self): """ Check logseries with arguments with mean and variance. """ code = """ def numpy_uniform_size_tuple(size): import numpy as np from numpy.random import uniform low, high = -987., 12345. 
rmean = 0.5 * (low + high) rvar = (high - low) ** 2 / 12 a = uniform(low, high, (size, size)) cond_mean = (a.mean() - rmean) / rmean < 0.05 cond_var = (np.var(a) - rvar) / rvar < 0.05 return cond_mean and cond_var """ > self.run_test(code, 70, numpy_uniform_size_tuple=[int]) pythran/tests/test_numpy_random.py:1621: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_uniform_size_tuple', cxxfile = '/tmp/tmphg_oknab.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1mju17w7', buildtmp = '/tmp/tmpz9v30qfv' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphg_oknab.cpp -o /tmp/tmpz9v30qfv/tmp/tmphg_oknab.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces 
-Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_uniform_size_tuple' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpz9v30qfv/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmphg_oknab.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmphg_oknab.cpp:26: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmphg_oknab.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_normal0b ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp0dr7ggpo.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp0dr7ggpo.cpp'], output_dir = '/tmp/tmpnq9fr7z9' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
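For orientation, the numpy.distutils frames above and below boil down to the plain distutils compile step that this docstring describes: obtain a compiler object, let customize_compiler apply the distro CFLAGS shown in the 'C compiler:' line, then hand it the generated C++ file together with the extra_postargs list. A minimal sketch of that step using only the standard-library API (the paths, macros and flags here are placeholders, not the exact values from this build):

# Sketch of the compile call the traceback is executing; /tmp/example.cpp,
# the macro list and the flag list are placeholders for illustration only.
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler

compiler = new_compiler()        # selects UnixCCompiler on this builder
customize_compiler(compiler)     # injects the platform CFLAGS seen in the log

objects = compiler.compile(
    ["/tmp/example.cpp"],
    output_dir="/tmp/build-temp",
    macros=[("ENABLE_PYTHON_MODULE", None), ("__PYTHRAN__", "3")],
    include_dirs=["/usr/include/python3.10"],
    extra_postargs=["-std=c++11", "-O1", "-w"],
)
# compiler.compile raises distutils.errors.CompileError when the spawned gcc
# exits non-zero; distutils' setup() converts that into SystemExit, and
# pythran's compile_cxxfile re-raises it as the CompileError recorded above.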
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o', ('/tmp/tmp0dr7ggpo.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o', '/tmp/tmp0dr7ggpo.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o', src = '/tmp/tmp0dr7ggpo.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0dr7ggpo.cpp -o /tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_normal0b', cxxfile = '/tmp/tmp0dr7ggpo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcnut216g', buildtmp = '/tmp/tmpnq9fr7z9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0dr7ggpo.cpp -o /tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_normal0b(self): """ Check normal with 2 argument with mean and variance. 
""" code = """ def numpy_normal0b(size): from numpy.random import normal from numpy import var, mean, sqrt mu, sigma = 0, 0.1 a = normal(mu, sigma, size) print(mean(a)) return (abs(mu - mean(a)) < 0.05 and abs(sigma - sqrt(var(a,ddof=1))) < .05) """ > self.run_test(code, 10 ** 5, numpy_normal0b=[int]) pythran/tests/test_numpy_random.py:302: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_normal0b', cxxfile = '/tmp/tmp0dr7ggpo.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpcnut216g', buildtmp = '/tmp/tmpnq9fr7z9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp0dr7ggpo.cpp -o /tmp/tmpnq9fr7z9/tmp/tmp0dr7ggpo.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_normal0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpnq9fr7z9/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp0dr7ggpo.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp0dr7ggpo.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp0dr7ggpo.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_normal1 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp6yvfjm4e.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp6yvfjm4e.cpp'], output_dir = '/tmp/tmp6lrq_e_s' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o', ('/tmp/tmp6yvfjm4e.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o', '/tmp/tmp6yvfjm4e.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o', src = '/tmp/tmp6yvfjm4e.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6yvfjm4e.cpp -o /tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_normal1', cxxfile = '/tmp/tmp6yvfjm4e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpaoca_rvb', buildtmp = '/tmp/tmp6lrq_e_s' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6yvfjm4e.cpp -o /tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_normal1(self): """ Check normal with size argument with mean and variance.""" code = """ def numpy_normal1(size): from numpy.random import normal from numpy import var, mean a = normal(size=size) print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_normal1=[int]) pythran/tests/test_numpy_random.py:316: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_normal1', cxxfile = '/tmp/tmp6yvfjm4e.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpaoca_rvb', buildtmp = '/tmp/tmp6lrq_e_s' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native 
module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp6yvfjm4e.cpp -o /tmp/tmp6lrq_e_s/tmp/tmp6yvfjm4e.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_normal1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 
-flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp6lrq_e_s/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp6yvfjm4e.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp6yvfjm4e.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp6yvfjm4e.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_weibull0a _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpc63r5hqb.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpc63r5hqb.cpp'], output_dir = '/tmp/tmp1096sr6d' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o', ('/tmp/tmpc63r5hqb.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o', '/tmp/tmpc63r5hqb.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o', src = '/tmp/tmpc63r5hqb.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc63r5hqb.cpp -o /tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_weibull0a', cxxfile = '/tmp/tmpc63r5hqb.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpicm90pmp', buildtmp = '/tmp/tmp1096sr6d' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc63r5hqb.cpp -o /tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_weibull0a(self): """ Check weibull with 1 argument with mean and variance. """ code = """ def numpy_weibull0a(size): from numpy.random import weibull from numpy import var, mean pa = 3. 
a = [weibull(pa) for x in range(size)] return (abs(mean(a) - pa) < 0.05 and abs(var(a) - 2*pa) < .05) """ > self.run_test(code, 10 ** 6, numpy_weibull0a=[int]) pythran/tests/test_numpy_random.py:802: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_weibull0a', cxxfile = '/tmp/tmpc63r5hqb.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpicm90pmp', buildtmp = '/tmp/tmp1096sr6d' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpc63r5hqb.cpp -o /tmp/tmp1096sr6d/tmp/tmpc63r5hqb.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured 
stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_weibull0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp1096sr6d/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpc63r5hqb.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpc63r5hqb.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpc63r5hqb.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_normal2 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
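The weibull0a failure above, and the test_numpy_normal2 case whose traceback opens just before this point, bottom out in the same diagnostic: the generated C++ includes pythonic/include/numpy/conjugate.hpp, whose SIMD wrapper calls xsimd::conj() on a complex batch, and the xsimd headers visible to this build do not declare conj in the xsimd namespace, so the compile exits with status 1 and distutils surfaces it as a CompileError. A minimal, hedged way to reproduce the error outside the test harness is to push a kernel that uses numpy.var through compile_pythrancode, the toolchain entry point named in the traceback; the include chain above shows that var.hpp drags in conjugate.hpp. The module, function, and variable names below are illustrative only, not part of the test suite.

# Hedged reproduction sketch, outside the pythran test suite.
# compile_pythrancode() is the toolchain entry point shown in the traceback;
# "repro_var" and variance_of are made-up names for illustration.
from distutils.errors import CompileError
from pythran.toolchain import compile_pythrancode

CODE = """
#pythran export variance_of(float list)
def variance_of(xs):
    from numpy import var
    return var(xs)
"""

try:
    # Generates the C++ (which includes pythonic/numpy/var.hpp and therefore
    # conjugate.hpp) and hands it to the compiler, as in the failing tests.
    compile_pythrancode("repro_var", CODE)
    print("built fine: this xsimd provides conj for complex batches")
except CompileError as exc:
    # Expected outcome given the header mismatch seen in this build log.
    print("compilation failed as in the log above:", exc)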
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpugbs3569.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpugbs3569.cpp'], output_dir = '/tmp/tmp3y69lud2' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp3y69lud2/tmp/tmpugbs3569.o', ('/tmp/tmpugbs3569.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
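The numpy.distutils listing above throttles parallel extension builds with a module-level Semaphore sized to the number of build jobs, and tracks in-flight object files in a set guarded by a plain Lock so the same source is never compiled by two threads at once. The sketch below reduces that pattern to its essentials; the two-job limit, the fake build step, and the object names are stand-ins, not numpy.distutils API.

# Hedged sketch of the lock + semaphore throttling pattern shown above.
# build_object() and the sleeps stand in for the real compiler invocation.
import threading
import time

_jobs = threading.Semaphore(2)   # at most two concurrent "compiles"
_lock = threading.Lock()         # guards the in-flight set below
_in_flight = set()

def build_object(obj):
    # Wait until no other thread is working on the same object file.
    while True:
        with _lock:
            if obj not in _in_flight:
                _in_flight.add(obj)
                break
        time.sleep(0.05)
    try:
        with _jobs:              # take one of the limited build slots
            time.sleep(0.1)      # placeholder for the actual compile
            print("built", obj)
    finally:
        with _lock:
            _in_flight.discard(obj)

threads = [threading.Thread(target=build_object, args=(name,))
           for name in ("a.o", "b.o", "b.o", "c.o")]
for t in threads:
    t.start()
for t in threads:
    t.join()

The real CCompiler_compile also calls _needs_build() first, so objects that are already up to date are skipped before any locking happens.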
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp3y69lud2/tmp/tmpugbs3569.o', '/tmp/tmpugbs3569.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp3y69lud2/tmp/tmpugbs3569.o', src = '/tmp/tmpugbs3569.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpugbs3569.cpp -o /tmp/tmp3y69lud2/tmp/tmpugbs3569.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_normal2', cxxfile = '/tmp/tmpugbs3569.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmplea9ti8y', buildtmp = '/tmp/tmp3y69lud2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_normal2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
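compile_cxxfile, listed just above, never shells out to a real setup.py: it wraps the temporary .cpp in a PythranExtension and drives distutils in-process with a faked command line (script_name='setup.py', script_args=[..., 'build_ext', '--build-lib', ...]), then converts the SystemExit that distutils raises on failure into a CompileError. The same two classes are what a package would normally use directly; the sketch below shows that ordinary setup.py usage, with "my_kernels.py" as a placeholder for a module carrying #pythran export annotations.

# Hedged sketch of ordinary PythranExtension / PythranBuildExt usage,
# i.e. the same machinery compile_cxxfile drives in-process above.
# "my_kernels" / "my_kernels.py" are placeholder names.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

setup(
    name="my_kernels",
    ext_modules=[PythranExtension("my_kernels", ["my_kernels.py"])],
    cmdclass={"build_ext": PythranBuildExt},
)

Running python setup.py build_ext against such a file goes through the same PythranBuildExt.build_extension path traced in these failures, so a broken header set fails it in exactly the same way.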
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpugbs3569.cpp -o /tmp/tmp3y69lud2/tmp/tmpugbs3569.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_normal2(self): """Check normal with shape argument with mean and variance.""" code = """ def numpy_normal2(size): from numpy.random import normal from numpy import mean, var a = normal(size=(size, size)) print(mean(a)) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 3, numpy_normal2=[int]) pythran/tests/test_numpy_random.py:328: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_normal2', cxxfile = '/tmp/tmpugbs3569.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmplea9ti8y', buildtmp = '/tmp/tmp3y69lud2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> 
native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpugbs3569.cpp -o /tmp/tmp3y69lud2/tmp/tmpugbs3569.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_normal2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv 
-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp3y69lud2/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpugbs3569.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpugbs3569.cpp:25: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpugbs3569.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_weibull0b _____________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpmz0b6c4x.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpmz0b6c4x.cpp'], output_dir = '/tmp/tmpd1dzq33q' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o', ('/tmp/tmpmz0b6c4x.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o', '/tmp/tmpmz0b6c4x.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o', src = '/tmp/tmpmz0b6c4x.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmz0b6c4x.cpp -o /tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_weibull0b', cxxfile = '/tmp/tmpmz0b6c4x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvx84syxe', buildtmp = '/tmp/tmpd1dzq33q' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
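The setup() source being replayed here also explains how pythran's compile_cxxfile can drive a full build_ext run without any real setup.py: when 'script_name' and 'script_args' are already present in the attributes, distutils uses them instead of sys.argv, so a synthetic command line can be handed straight to setup(). A hedged sketch of that "fake CLI call" pattern follows (the module name, the throwaway C source and the temporary directories are illustrative, not taken from this build); the log continues below with the SystemExit raised from exactly this code path.

import os
import tempfile
from distutils.core import setup, Extension

# throwaway C source so build_ext has something to compile (illustrative)
src = tempfile.NamedTemporaryFile(suffix=".c", delete=False, mode="w")
src.write("int answer(void) { return 42; }\n")
src.close()

builddir = tempfile.mkdtemp()
buildtmp = tempfile.mkdtemp()

setup(
    name="demo",
    ext_modules=[Extension("demo", [src.name])],
    # fake CLI call, mirroring pythran/toolchain.py above
    script_name="setup.py",
    script_args=["--quiet", "build_ext",
                 "--build-lib", builddir,
                 "--build-temp", buildtmp],
)
print(os.listdir(builddir))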
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmz0b6c4x.cpp -o /tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_weibull0b(self): """ Check weibull with 2 argument with mean and variance. 
""" code = """ def numpy_weibull0b(size): from numpy.random import weibull from numpy import var, mean, sqrt pa = 2 a = weibull(pa, size) return (abs(mean(a) - pa) < 0.05 and abs(var(a) - pa*2 ) < .05) """ > self.run_test(code, 10 ** 6, numpy_weibull0b=[int]) pythran/tests/test_numpy_random.py:814: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_weibull0b', cxxfile = '/tmp/tmpmz0b6c4x.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpvx84syxe', buildtmp = '/tmp/tmpd1dzq33q' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpmz0b6c4x.cpp -o /tmp/tmpd1dzq33q/tmp/tmpmz0b6c4x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed 
with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_weibull0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpd1dzq33q/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpmz0b6c4x.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpmz0b6c4x.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
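The CompileError above is raised by pythran's own compile_cxxfile helper (its full source appears earlier in this traceback), so the failure can be iterated on outside the test suite by calling that entry point directly on a Pythran-generated C++ file. A hedged sketch follows: the path and module name are placeholders, and the extra_compile_args list is a trimmed version of the one captured in the log. The remainder of the gcc diagnostic continues immediately after this sketch.

import logging
from distutils.errors import CompileError
from pythran.toolchain import compile_cxxfile

logging.getLogger("pythran").setLevel(logging.INFO)

try:
    so_file = compile_cxxfile(
        "test_numpy_weibull0b",   # module name, as in the failing test
        "/tmp/module.cpp",        # placeholder for the mkstemp()-generated C++ file
        extra_compile_args=["-O1", "-w", "-UNDEBUG"],
    )
    print("built:", so_file)
except CompileError as exc:
    # with the unpatched system xsimd headers this reproduces the gcc failure above
    print("compilation failed:", exc)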
   25 | return xsimd::conj(v);
      |               ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpmz0b6c4x.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 | conj(_Tp __x)
      | ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
_____________________ TestNumpyRandom.test_numpy_pareto0a ______________________
[gw7] linux -- Python 3.10.1 /usr/bin/python3

attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto0a', ...}
klass =
dist =
ok = True

def setup (**attrs):
    """The gateway to the Distutils: do everything your setup script needs
    to do, in a highly flexible and user-driven way. Briefly: create a
    Distribution instance; find and parse config files; parse the command
    line; run each Distutils command found there, customized by the options
    supplied to 'setup()' (as keyword arguments), in config files, and on
    the command line.

    The Distribution instance might be an instance of a class supplied via
    the 'distclass' keyword argument to 'setup'; if no such class is
    supplied, then the Distribution class (in dist.py) is instantiated.
    All other arguments to 'setup' (except for 'cmdclass') are used to set
    attributes of the Distribution instance.

    The 'cmdclass' argument, if supplied, is a dictionary mapping command
    names to command classes. Each command encountered on the command line
    will be turned into a command class, which is in turn instantiated;
    any class found in 'cmdclass' is used in place of the default, which is
    (for command 'foo_bar') class 'foo_bar' in module
    'distutils.command.foo_bar'. The command class must provide a
    'user_options' attribute which is a list of option specifiers for
    'distutils.fancy_getopt'. Any command-line options between the current
    and the next command are used to set attributes of the current command
    object. When the entire command-line has been successfully parsed,
    calls the 'run()' method on each command object in turn. This method
    will be driven entirely by the Distribution object (which each command
    object has a reference to, thanks to its constructor), and the
    command-specific options that became attributes of each command object.
    """
    global _setup_stop_after, _setup_distribution

    # Determine the distribution class -- either caller-supplied or
    # our Distribution (see below).
    klass = attrs.get('distclass')
    if klass:
        del attrs['distclass']
    else:
        klass = Distribution

    if 'script_name' not in attrs:
        attrs['script_name'] = os.path.basename(sys.argv[0])
    if 'script_args' not in attrs:
        attrs['script_args'] = sys.argv[1:]

    # Create the Distribution instance, using the remaining arguments
    # (ie.
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpeh4htama.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpeh4htama.cpp'], output_dir = '/tmp/tmpphrwhfc7' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpphrwhfc7/tmp/tmpeh4htama.o', ('/tmp/tmpeh4htama.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpphrwhfc7/tmp/tmpeh4htama.o', '/tmp/tmpeh4htama.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpphrwhfc7/tmp/tmpeh4htama.o', src = '/tmp/tmpeh4htama.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeh4htama.cpp -o /tmp/tmpphrwhfc7/tmp/tmpeh4htama.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_pareto0a', cxxfile = '/tmp/tmpeh4htama.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmps6ye8mgf', buildtmp = '/tmp/tmpphrwhfc7' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeh4htama.cpp -o /tmp/tmpphrwhfc7/tmp/tmpeh4htama.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_pareto0a(self): """ Check pareto with 1 argument with mean and variance. 
""" code = """ def numpy_pareto0a(size): from numpy.random import pareto from numpy import var, mean alpha = 10 rvar = alpha/((alpha-1)**2*(alpha-2)) a = [pareto(alpha) for x in range(size)] return (abs(mean(a)- 0.5) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_pareto0a=[int]) pythran/tests/test_numpy_random.py:965: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_pareto0a', cxxfile = '/tmp/tmpeh4htama.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmps6ye8mgf', buildtmp = '/tmp/tmpphrwhfc7' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpeh4htama.cpp -o /tmp/tmpphrwhfc7/tmp/tmpeh4htama.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_pareto0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpphrwhfc7/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpeh4htama.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpeh4htama.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
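The kernel being compiled here is the numpy_pareto0a function quoted a few lines up, and the failure sits entirely in the C++ toolchain, not in the Python code. Running the same kernel under the plain interpreter is therefore a quick way to exercise the test logic independently of the compiler; a sketch is below, with the sample size reduced from the 10 ** 6 used by run_test, and it is only a reference run, not a substitute for the compiled module. The rest of the gcc diagnostic continues after it.

from numpy import mean, var
from numpy.random import pareto

def numpy_pareto0a(size):
    # same kernel the test hands to Pythran, executed by CPython + NumPy
    alpha = 10
    rvar = alpha / ((alpha - 1) ** 2 * (alpha - 2))
    a = [pareto(alpha) for x in range(size)]
    return (abs(mean(a) - 0.5) < .05 and abs(var(a) - rvar) < .05)

if __name__ == "__main__":
    print(numpy_pareto0a(10 ** 5))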
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpeh4htama.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_weibull2 ______________________ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
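The compile failures in this run share one root cause: the generated C++ includes pythonic/include/numpy/conjugate.hpp (pulled in through numpy.var), whose SIMD wrapper calls xsimd::conj, and the xsimd headers used for this build do not provide that overload, so gcc can only suggest std::conj. The failure can be reproduced outside the test harness; below is a minimal sketch, assuming pythran's compile_pythrancode helper (the call seen at pythran/toolchain.py:418) can be invoked with a module name and a Pythran source string, and using a hypothetical kernel chosen only because numpy.var drags in the offending header.

from distutils.errors import CompileError
from pythran import compile_pythrancode

# Hypothetical kernel: numpy.var is enough to pull in
# pythonic/numpy/var.hpp -> pythonic/numpy/conjugate.hpp, the header the
# diagnostic above points at.
KERNEL = """
#pythran export repro_var(float64[])
def repro_var(a):
    from numpy import var
    return var(a)
"""

try:
    compile_pythrancode("repro_xsimd_conj", KERNEL)
except CompileError as exc:
    # Expected to fail with the same "'conj' is not a member of 'xsimd'"
    # error for as long as the xsimd header mismatch persists.
    print(exc)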
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpizqjlh6k.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpizqjlh6k.cpp'], output_dir = '/tmp/tmp4eso6hbg' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o', ('/tmp/tmpizqjlh6k.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
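The numpy.distutils CCompiler_compile frame above guards parallel builds with two primitives: a module-level lock plus a set of in-flight object files (so two extensions sharing a source file never build the same object twice) and a semaphore that caps the number of concurrent compiler processes. A condensed sketch of that pattern, with illustrative names rather than numpy.distutils internals:

import threading
import time

_lock = threading.Lock()          # protects the in-flight set; check-and-add is not atomic under the GIL
_in_flight = set()                # object files currently being built
_jobs = threading.Semaphore(4)    # assumed job count; numpy derives it from get_num_build_jobs()

def compile_once(obj, compile_fn):
    # Wait until no other thread is building the same object file.
    while True:
        with _lock:
            if obj not in _in_flight:
                _in_flight.add(obj)
                break
        time.sleep(0.1)
    try:
        with _jobs:               # bound the number of parallel compiler invocations
            compile_fn(obj)
    finally:
        with _lock:
            _in_flight.discard(obj)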
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o', '/tmp/tmpizqjlh6k.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o', src = '/tmp/tmpizqjlh6k.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpizqjlh6k.cpp -o /tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_weibull2', cxxfile = '/tmp/tmpizqjlh6k.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfsbg4bi6', buildtmp = '/tmp/tmp4eso6hbg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_weibull2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
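What the traceback shows at pythran/toolchain.py:300 is distutils being driven programmatically: compile_cxxfile fakes a command line by passing script_name and script_args to setup(), so a failed gcc run surfaces as SystemExit, which pythran converts back into a CompileError. A minimal sketch of the same pattern with a plain setuptools Extension (the module name and source file here are placeholders):

from distutils.errors import CompileError
from setuptools import setup, Extension

def build_extension_inplace(name, source, build_lib, build_temp):
    """Compile one C/C++ source into an extension module via a fake CLI call."""
    ext = Extension(name, [source])
    try:
        setup(name=name,
              ext_modules=[ext],
              script_name='setup.py',            # pretend we were run as a setup script
              script_args=['--quiet', 'build_ext',
                           '--build-lib', build_lib,
                           '--build-temp', build_temp])
    except SystemExit as e:
        # distutils reports failed commands through SystemExit;
        # re-raise so callers see an ordinary exception instead.
        raise CompileError(str(e))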
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpizqjlh6k.cpp -o /tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_weibull2(self): """Check weibull with shape argument with mean and variance.""" code = """ def numpy_weibull2(size): from numpy.random import weibull from numpy import mean, var pa = 1 a = weibull(pa, size=(size, size)) return (abs(mean(a)) - pa < .05 and abs(var(a) - 2*pa) < .05) """ > self.run_test(code, 10 ** 3, numpy_weibull2=[int]) pythran/tests/test_numpy_random.py:826: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_weibull2', cxxfile = '/tmp/tmpizqjlh6k.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfsbg4bi6', buildtmp = '/tmp/tmp4eso6hbg' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): 
'''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpizqjlh6k.cpp -o /tmp/tmp4eso6hbg/tmp/tmpizqjlh6k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_weibull2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection 
-D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp4eso6hbg/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpizqjlh6k.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpizqjlh6k.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpizqjlh6k.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_pareto0b ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
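# A condensed sketch of the flag-filtering step that follows: every flag
# listed in pythran's ignoreflags setting (stood in for here by a hard-coded
# list) is stripped from the compiler and linker command lists, and missing
# attributes or already-absent flags are simply skipped.
IGNORE_FLAGS = ['-Wstrict-prototypes']   # illustrative; pythran reads cfg.cfg.get('compiler', 'ignoreflags')

def strip_ignored_flags(compiler, targets=('compiler_so', 'linker_so')):
    for flag in IGNORE_FLAGS:
        for target in targets:
            try:
                while True:                      # remove repeated occurrences of the flag
                    getattr(compiler, target).remove(flag)
            except (AttributeError, ValueError):
                pass                             # no such attribute, or flag not present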
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpay_k_h9z.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpay_k_h9z.cpp'], output_dir = '/tmp/tmpbye7gyga' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o', ('/tmp/tmpay_k_h9z.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o', '/tmp/tmpay_k_h9z.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o', src = '/tmp/tmpay_k_h9z.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpay_k_h9z.cpp -o /tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_pareto0b', cxxfile = '/tmp/tmpay_k_h9z.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp975ifjrz', buildtmp = '/tmp/tmpbye7gyga' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
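The setup() docstring quoted above describes the programmatic entry point that pythran's compile_cxxfile relies on: it fakes a command line (script_name plus explicit script_args) so that only build_ext runs, pointed at temporary build directories. A hedged sketch of that calling convention follows; the module and source names are hypothetical.

    # Sketch of the programmatic setup() call pattern used by compile_cxxfile
    # above: a fake CLI invocation that runs only build_ext into temporary
    # directories.
    from tempfile import mkdtemp
    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = Extension("example_mod", sources=["example_mod.cpp"],
                    extra_compile_args=["-std=c++11"])

    try:
        setup(name="example_mod",
              ext_modules=[ext],
              script_name="setup.py",                    # fake script name
              script_args=["--quiet", "build_ext",
                           "--build-lib", builddir,
                           "--build-temp", buildtmp])
    except SystemExit as e:
        # distutils converts build failures into SystemExit; compile_cxxfile
        # re-raises them as CompileError, which is what the tests report above.
        raise CompileError(str(e))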
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpay_k_h9z.cpp -o /tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_pareto0b(self): """ Check pareto with 2 argument with mean and variance. 
""" code = """ def numpy_pareto0b(size): from numpy.random import pareto from numpy import var, mean, sqrt alpha = 6 rvar = alpha/((alpha-1)**2*(alpha-2)) a = pareto(alpha, size) return (abs(mean(a)- 0.5) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_pareto0b=[int]) pythran/tests/test_numpy_random.py:978: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_pareto0b', cxxfile = '/tmp/tmpay_k_h9z.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp975ifjrz', buildtmp = '/tmp/tmpbye7gyga' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpay_k_h9z.cpp -o /tmp/tmpbye7gyga/tmp/tmpay_k_h9z.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_pareto0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpbye7gyga/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpay_k_h9z.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpay_k_h9z.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpay_k_h9z.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_pareto2 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
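build_ext.run(), quoted above, essentially builds a CCompiler, applies the interpreter's build configuration to it, and registers include directories and macros so they apply to every compile and link that follows. A short hedged sketch of just that configuration step (all values illustrative):

    # Sketch of the compiler setup performed by build_ext.run() above:
    # create a compiler, apply the interpreter's build configuration, then
    # register include dirs and macros for all subsequent compiles/links.
    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    compiler = new_compiler(verbose=1)
    customize_compiler(compiler)               # pulls CC/CFLAGS/LDSHARED from sysconfig
    compiler.set_include_dirs(["/usr/include/python3.10"])
    compiler.define_macro("ENABLE_PYTHON_MODULE")
    compiler.undefine_macro("NDEBUG")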
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
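The pythran/dist.py frame above (PythranBuildExtMixIn.build_extension) backs up the compiler settings, optionally swaps in a C++ compiler, and strips flags that are invalid for C++ before delegating to stock distutils. Below is a simplified sketch of that flag-stripping idea as a small build_ext subclass; it is not pythran's actual class, and the flag list is illustrative.

    # Simplified sketch of the flag-stripping idea in PythranBuildExtMixIn above:
    # a build_ext subclass that removes C-only flags from the C++ compile/link
    # command lines before handing the extension to stock distutils.
    from distutils.command.build_ext import build_ext

    IGNORED_FLAGS = ("-Wstrict-prototypes",)   # flags that are invalid for C++

    class CxxBuildExt(build_ext):
        def build_extension(self, ext):
            for target in ("compiler_so", "linker_so"):
                flags = getattr(self.compiler, target, None)
                if flags is None:
                    continue
                for flag in IGNORED_FLAGS:
                    while flag in flags:
                        flags.remove(flag)
            return super().build_extension(ext)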
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpf01fv04a.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpf01fv04a.cpp'], output_dir = '/tmp/tmp96g_jta1' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp96g_jta1/tmp/tmpf01fv04a.o', ('/tmp/tmpf01fv04a.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp96g_jta1/tmp/tmpf01fv04a.o', '/tmp/tmpf01fv04a.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp96g_jta1/tmp/tmpf01fv04a.o', src = '/tmp/tmpf01fv04a.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpf01fv04a.cpp -o /tmp/tmp96g_jta1/tmp/tmpf01fv04a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_pareto2', cxxfile = '/tmp/tmpf01fv04a.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptuts74xx', buildtmp = '/tmp/tmp96g_jta1' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_pareto2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpf01fv04a.cpp -o /tmp/tmp96g_jta1/tmp/tmpf01fv04a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_pareto2(self): """Check pareto with shape argument with mean and variance.""" code = """ def numpy_pareto2(size): from numpy.random import pareto from numpy import mean, var alpha = 5 rvar = alpha/((alpha-1)**2*(alpha-2)) a = pareto(alpha, size=(size, size)) return (abs(mean(a)- 0.5) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_pareto2=[int]) pythran/tests/test_numpy_random.py:991: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_pareto2', cxxfile = '/tmp/tmpf01fv04a.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptuts74xx', buildtmp = '/tmp/tmp96g_jta1' extension = def compile_cxxfile(module_name, 
cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpf01fv04a.cpp -o /tmp/tmp96g_jta1/tmp/tmpf01fv04a.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_pareto2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 
-fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp96g_jta1/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpf01fv04a.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpf01fv04a.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpf01fv04a.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_poisson0 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpkh320vuu.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpkh320vuu.cpp'], output_dir = '/tmp/tmptt58p3d4' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmptt58p3d4/tmp/tmpkh320vuu.o', ('/tmp/tmpkh320vuu.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmptt58p3d4/tmp/tmpkh320vuu.o', '/tmp/tmpkh320vuu.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmptt58p3d4/tmp/tmpkh320vuu.o', src = '/tmp/tmpkh320vuu.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkh320vuu.cpp -o /tmp/tmptt58p3d4/tmp/tmpkh320vuu.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_poisson0', cxxfile = '/tmp/tmpkh320vuu.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpz60n2jjf', buildtmp = '/tmp/tmptt58p3d4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkh320vuu.cpp -o /tmp/tmptt58p3d4/tmp/tmpkh320vuu.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_poisson0(self): """ Check poisson without argument with mean and variance. 
""" code = """ def numpy_poisson0(size): from numpy.random import poisson from numpy import var, mean a = [poisson() for x in range(size)] print(mean(a)) return (abs(mean(a)-1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_poisson0=[int]) pythran/tests/test_numpy_random.py:344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_poisson0', cxxfile = '/tmp/tmpkh320vuu.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpz60n2jjf', buildtmp = '/tmp/tmptt58p3d4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpkh320vuu.cpp -o /tmp/tmptt58p3d4/tmp/tmpkh320vuu.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" 
failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_poisson0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmptt58p3d4/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpkh320vuu.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpkh320vuu.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpkh320vuu.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_poisson0a _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphskfgq5w.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphskfgq5w.cpp'], output_dir = '/tmp/tmpuxbaldwi' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o', ('/tmp/tmphskfgq5w.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o', '/tmp/tmphskfgq5w.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o', src = '/tmp/tmphskfgq5w.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphskfgq5w.cpp -o /tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_poisson0a', cxxfile = '/tmp/tmphskfgq5w.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpyf_88cjl', buildtmp = '/tmp/tmpuxbaldwi' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphskfgq5w.cpp -o /tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_poisson0a(self): """ Check poisson with 1 argument with mean and variance. """ code = """ def numpy_poisson0a(size): from numpy.random import poisson from numpy import var, mean a = [poisson(3.) 
for x in range(size)] print(mean(a)) return (abs(mean(a)-3) < .05 and abs(var(a) - 3) < .05) """ > self.run_test(code, 10 ** 5, numpy_poisson0a=[int]) pythran/tests/test_numpy_random.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_poisson0a', cxxfile = '/tmp/tmphskfgq5w.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpyf_88cjl', buildtmp = '/tmp/tmpuxbaldwi' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphskfgq5w.cpp -o /tmp/tmpuxbaldwi/tmp/tmphskfgq5w.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call 
-----------------------------
running build_ext
new_compiler returns
building 'test_numpy_poisson0a' extension
C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpuxbaldwi/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmphskfgq5w.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmphskfgq5w.cpp:29:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch<std::complex<T>, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch<std::complex<T>, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmphskfgq5w.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_poisson0b _____________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpp0uplhyt.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpp0uplhyt.cpp'], output_dir = '/tmp/tmp158xe_9_' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o', ('/tmp/tmpp0uplhyt.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o', '/tmp/tmpp0uplhyt.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o', src = '/tmp/tmpp0uplhyt.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp0uplhyt.cpp -o /tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_poisson0b', cxxfile = '/tmp/tmpp0uplhyt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptrr22jst', buildtmp = '/tmp/tmp158xe_9_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp0uplhyt.cpp -o /tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_poisson0b(self): """ Check poisson with 2 argument with mean and variance. 
""" code = """ def numpy_poisson0b(size): from numpy.random import poisson from numpy import var, mean, sqrt lam = 10 a = poisson(lam, size) print(mean(a)) return (abs(mean(a)-lam) < 0.05 and abs(sqrt(lam) - sqrt(var(a,ddof=1))) < .05) """ > self.run_test(code, 10 ** 5, numpy_poisson0b=[int]) pythran/tests/test_numpy_random.py:369: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_poisson0b', cxxfile = '/tmp/tmpp0uplhyt.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmptrr22jst', buildtmp = '/tmp/tmp158xe_9_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpp0uplhyt.cpp -o /tmp/tmp158xe_9_/tmp/tmpp0uplhyt.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value 
-Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1
pythran/toolchain.py:313: CompileError
----------------------------- Captured stdout call -----------------------------
running build_ext
new_compiler returns
building 'test_numpy_poisson0b' extension
C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp158xe_9_/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpp0uplhyt.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /tmp/tmpp0uplhyt.cpp:27:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch<std::complex<T>, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch<std::complex<T>, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpp0uplhyt.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_poisson1 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
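For reference, the acceptance check performed by the numpy_poisson0b kernel quoted at the top of this failure can be run directly against NumPy. The sketch below is a plain-Python reference for the statistic being tested, mirroring the quoted test code (lam = 10 and 10 ** 5 samples, as in the run_test call); it is illustrative and not part of the build itself:

from numpy import mean, sqrt, var
from numpy.random import poisson

def numpy_poisson0b_reference(size, lam=10):
    # Sample mean should approach lam and the sample standard deviation
    # should approach sqrt(lam) for Poisson-distributed data.
    a = poisson(lam, size)
    return (abs(mean(a) - lam) < 0.05
            and abs(sqrt(lam) - sqrt(var(a, ddof=1))) < 0.05)

if __name__ == "__main__":
    print(numpy_poisson0b_reference(10 ** 5))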
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp1e0r2wd4.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp1e0r2wd4.cpp'], output_dir = '/tmp/tmphpdt622p' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
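The CCompiler_compile body that follows serializes work on each object file with a module-level lock and caps concurrency with a job semaphore, so the same object is never built twice at once even when two extensions share a source file. A distilled sketch of that pattern (the job count and the compile callback are illustrative, not numpy.distutils API):

import threading
import time

_global_lock = threading.Lock()
_job_semaphore = threading.Semaphore(4)   # illustrative job count
_processing_files = set()

def build_one(obj, compile_fn):
    # Claim the object under the lock so no other thread starts building it.
    while True:
        with _global_lock:
            if obj not in _processing_files:
                _processing_files.add(obj)
                break
        time.sleep(0.1)   # another thread is building it; wait and retry
    try:
        with _job_semaphore:   # bound the number of concurrent compiles
            compile_fn(obj)
    finally:
        with _global_lock:
            _processing_files.remove(obj)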
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o', ('/tmp/tmp1e0r2wd4.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o', '/tmp/tmp1e0r2wd4.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o', src = '/tmp/tmp1e0r2wd4.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1e0r2wd4.cpp -o /tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_poisson1', cxxfile = '/tmp/tmp1e0r2wd4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8f6u_6el', buildtmp = '/tmp/tmphpdt622p' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
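The compile_cxxfile helper quoted in these tracebacks never shells out to a setup.py: it calls distutils setup() in-process with a synthetic script name and argument list, and turns the SystemExit that distutils raises on failure into a CompileError. A stripped-down sketch of that pattern, using a plain distutils Extension and an illustrative helper name rather than PythranExtension:

from tempfile import mkdtemp
from distutils.core import setup, Extension
from distutils.errors import CompileError

def build_c_extension(module_name, cfile):
    # Drive distutils with a fake command line, as compile_cxxfile does.
    builddir = mkdtemp()
    buildtmp = mkdtemp()
    ext = Extension(module_name, [cfile])
    try:
        setup(name=module_name,
              ext_modules=[ext],
              script_name='setup.py',
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as e:
        # distutils reports build failures via SystemExit; re-raise as
        # CompileError, matching the behaviour shown in the traceback above.
        raise CompileError(str(e))
    return builddir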
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1e0r2wd4.cpp -o /tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_poisson1(self): """ Check poisson with size argument with mean and variance.""" code = """ def numpy_poisson1(size): from numpy.random import poisson from numpy import var, mean a = poisson(size=size) print(mean(a)) return (abs(mean(a)-1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 5, numpy_poisson1=[int]) pythran/tests/test_numpy_random.py:383: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_poisson1', cxxfile = '/tmp/tmp1e0r2wd4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8f6u_6el', buildtmp = '/tmp/tmphpdt622p' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> 
native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1e0r2wd4.cpp -o /tmp/tmphpdt622p/tmp/tmp1e0r2wd4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_poisson1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC 
-fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmphpdt622p/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp1e0r2wd4.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp1e0r2wd4.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp1e0r2wd4.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _____________________ TestNumpyRandom.test_numpy_poisson2 ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
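Each of these kernels fails for the same reason: pythonic/numpy/var.hpp includes pythonic/include/numpy/conjugate.hpp, whose wrapper calls xsimd::conj, and the compiler cannot find conj in namespace xsimd (it suggests std::conj instead). The random module is incidental, so a reproducer only needs a kernel that touches numpy.var; the sketch below should trigger the same diagnostic when compiled with the pythran CLI (file and function names are illustrative):

# repro_var.py -- compile with:  pythran repro_var.py
#pythran export repro_var(float64[])
def repro_var(a):
    # numpy.var maps to pythonic/numpy/var.hpp, which pulls in the
    # conjugate.hpp wrapper where the rejected xsimd::conj call lives.
    from numpy import var
    return var(a)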
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp8hh7u44r.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp8hh7u44r.cpp'], output_dir = '/tmp/tmp_i_7dt8f' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o', ('/tmp/tmp8hh7u44r.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o', '/tmp/tmp8hh7u44r.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o', src = '/tmp/tmp8hh7u44r.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8hh7u44r.cpp -o /tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_poisson2', cxxfile = '/tmp/tmp8hh7u44r.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5j6icv37', buildtmp = '/tmp/tmp_i_7dt8f' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_poisson2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8hh7u44r.cpp -o /tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_poisson2(self): """Check poisson with shape argument with mean and variance.""" code = """ def numpy_poisson2(size): from numpy.random import poisson from numpy import mean, var a = poisson(size=(size, size)) print(mean(a)) return (abs(mean(a)-1) < .05 and abs(var(a) - 1) < .05) """ > self.run_test(code, 10 ** 3, numpy_poisson2=[int]) pythran/tests/test_numpy_random.py:395: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_poisson2', cxxfile = '/tmp/tmp8hh7u44r.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5j6icv37', buildtmp = '/tmp/tmp_i_7dt8f' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ 
file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8hh7u44r.cpp -o /tmp/tmp_i_7dt8f/tmp/tmp8hh7u44r.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_poisson2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE 
-fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp_i_7dt8f/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp8hh7u44r.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmp8hh7u44r.cpp:25: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp8hh7u44r.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_power0a ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
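The captured stderr above points at the root cause shared by these failures: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, which the xsimd headers used for this build do not provide, so every generated module that includes pythonic/numpy/var.hpp fails the same way. A hypothetical stand-alone reproducer, assuming the pythran CLI from this build is on PATH (the file and function names are invented; the body mirrors the failing test):

import pathlib
import subprocess
import tempfile
import textwrap

src = textwrap.dedent("""
    #pythran export repro_var(int)
    def repro_var(size):
        from numpy.random import poisson
        from numpy import mean, var
        a = poisson(size=(size, size))
        return mean(a), var(a)      # var() pulls in pythonic/numpy/conjugate.hpp
""")

with tempfile.TemporaryDirectory() as workdir:
    module = pathlib.Path(workdir) / "repro_var.py"
    module.write_text(src)
    # roughly what the test harness does through compile_pythrancode()
    subprocess.run(["pythran", str(module)], check=False)

If this reproduces the same "'conj' is not a member of 'xsimd'" diagnostic, the problem sits in the headers used by this pythran source tree rather than in the test harness.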
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmplztfp4y9.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmplztfp4y9.cpp'], output_dir = '/tmp/tmpq7gu95s8' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o', ('/tmp/tmplztfp4y9.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
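compile_cxxfile, visible in several frames of these tracebacks, builds each generated C++ file by driving distutils in-process with a synthetic command line instead of spawning python setup.py build_ext. A minimal sketch of that fake-CLI pattern, with placeholder module and file names and a trivial C source so it stands alone (this is not pythran's code, just the same idea):

import pathlib
import textwrap
from distutils.core import setup, Extension
from tempfile import mkdtemp

builddir, buildtmp = mkdtemp(), mkdtemp()   # output dir and scratch dir

# a throwaway C extension so the example is self-contained
src = pathlib.Path(buildtmp) / "demo_mod.c"
src.write_text(textwrap.dedent("""
    #include <Python.h>
    static struct PyModuleDef mod = {PyModuleDef_HEAD_INIT, "demo_mod", NULL, -1, NULL};
    PyMODINIT_FUNC PyInit_demo_mod(void) { return PyModule_Create(&mod); }
"""))

setup(
    name="demo_mod",
    ext_modules=[Extension("demo_mod", sources=[str(src)])],
    script_name="setup.py",                 # pretend a real setup.py invoked us
    script_args=["--quiet", "build_ext",    # i.e. python setup.py build_ext ...
                 "--build-lib", builddir,
                 "--build-temp", buildtmp],
)
print("built into", builddir)

Keeping the build in-process is why a compiler failure surfaces as SystemExit and is re-raised as CompileError in the surrounding frames.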
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o', '/tmp/tmplztfp4y9.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o', src = '/tmp/tmplztfp4y9.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplztfp4y9.cpp -o /tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_power0a', cxxfile = '/tmp/tmplztfp4y9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0vywga_m', buildtmp = '/tmp/tmpq7gu95s8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power0a', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplztfp4y9.cpp -o /tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_power0a(self): """ Check power with 1 argument with mean and variance. 
""" code = """ def numpy_power0a(size): from numpy.random import power from numpy import var, mean alpha = 1 rmean = alpha / (alpha + 1) rvar = alpha/((alpha+1)**2*(alpha+2)) a = [power(alpha) for x in range(size)] return (abs(mean(a)- rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_power0a=[int]) pythran/tests/test_numpy_random.py:1009: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_power0a', cxxfile = '/tmp/tmplztfp4y9.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp0vywga_m', buildtmp = '/tmp/tmpq7gu95s8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmplztfp4y9.cpp -o /tmp/tmpq7gu95s8/tmp/tmplztfp4y9.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option 
-Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_power0a' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpq7gu95s8/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmplztfp4y9.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmplztfp4y9.cpp:29: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmplztfp4y9.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_power0b ______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
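Annotation: the frames above show setup() dispatching through run_commands()/run_command() into build_ext.run(); in this build that chain is always started by the setup() call compile_cxxfile() assembles with its "fake CLI call" (script_name plus script_args, no real setup.py). Written out as a stand-alone script the same call looks roughly like the sketch below -- plain distutils for brevity, with placeholder file and directory names standing in for the temporaries pythran generates:

    from distutils.core import setup
    from pythran.dist import PythranExtension, PythranBuildExt

    setup(
        name="test_numpy_power0b",                        # placeholder module name
        ext_modules=[PythranExtension(
            "test_numpy_power0b",
            ["tmpuej309lp.cpp"],                          # placeholder for the generated C++ file
            extra_compile_args=["-O1", "-Wall", "-w", "-UNDEBUG"],
        )],
        cmdclass={"build_ext": PythranBuildExt},          # swaps in pythran's build_ext
        # the "fake CLI call": arguments are handed to setup() directly
        script_name="setup.py",
        script_args=["--quiet", "build_ext",
                     "--build-lib", "build-lib",
                     "--build-temp", "build-temp"],
    )
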
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpuej309lp.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpuej309lp.cpp'], output_dir = '/tmp/tmpnk6hdbc4' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o', ('/tmp/tmpuej309lp.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
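Annotation: the single_compile() helper quoted above is how numpy.distutils bounds build parallelism -- a semaphore caps the number of compiler invocations in flight, while a lock plus an "in progress" set keeps two threads from racing on the same object file. A self-contained sketch of that pattern with an obviously fake compile step (fake_compile and JOBS are illustrative, not numpy.distutils API):

    import threading
    import time
    from multiprocessing.pool import ThreadPool

    JOBS = 4
    _job_semaphore = threading.Semaphore(JOBS)
    _global_lock = threading.Lock()
    _processing = set()

    def fake_compile(obj):
        time.sleep(0.01)          # stand-in for self._compile(obj, src, ext, ...)

    def single_compile(obj):
        while True:
            with _global_lock:    # no atomic check-and-add, so take the lock explicitly
                if obj not in _processing:
                    _processing.add(obj)
                    break
            time.sleep(0.1)       # another thread is building this object; wait
        try:
            with _job_semaphore:  # at most JOBS compilations run concurrently
                fake_compile(obj)
        finally:
            with _global_lock:
                _processing.discard(obj)

    ThreadPool(JOBS).map(single_compile, ["obj%d.o" % i for i in range(16)])
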
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o', '/tmp/tmpuej309lp.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o', src = '/tmp/tmpuej309lp.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuej309lp.cpp -o /tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_power0b', cxxfile = '/tmp/tmpuej309lp.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpd1ac3lv8', buildtmp = '/tmp/tmpnk6hdbc4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power0b', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuej309lp.cpp -o /tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_power0b(self): """ Check power with 2 argument with mean and variance. 
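Annotation: both power tests only validate the compiled kernel against the first two moments of numpy.random.power with exponent alpha, namely E[X] = alpha/(alpha+1) and Var[X] = alpha/((alpha+1)**2 * (alpha+2)), to within 0.05. In plain NumPy the acceptance check amounts to the sketch below; the pythran-compiled version of the same check appears in the captured test source that follows.

    import numpy as np

    alpha, size = 1, 10 ** 6
    a = np.random.power(alpha, size)

    rmean = alpha / (alpha + 1)                       # E[X] for the power distribution
    rvar = alpha / ((alpha + 1) ** 2 * (alpha + 2))   # Var[X]

    assert abs(a.mean() - rmean) < 0.05
    assert abs(a.var() - rvar) < 0.05
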
""" code = """ def numpy_power0b(size): from numpy.random import power from numpy import var, mean, sqrt alpha = 1 rmean = alpha / (alpha + 1) rvar = alpha/((alpha+1)**2*(alpha+2)) a = power(alpha, size) return (abs(mean(a)- rmean) < 0.05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 6, numpy_power0b=[int]) pythran/tests/test_numpy_random.py:1023: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_power0b', cxxfile = '/tmp/tmpuej309lp.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpd1ac3lv8', buildtmp = '/tmp/tmpnk6hdbc4' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpuej309lp.cpp -o /tmp/tmpnk6hdbc4/tmp/tmpuej309lp.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs 
-Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_power0b' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpnk6hdbc4/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpuej309lp.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpuej309lp.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpuej309lp.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_power2 _______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. 
everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
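Annotation: build_ext.run(), quoted again below for the test_numpy_power2 failure, sets up its compiler with new_compiler() followed by customize_compiler() and then feeds it macros and include directories. The same two calls can drive a compiler by hand; a stripped-down sketch under the assumption that a working C toolchain is on PATH, with a throwaway source file and a macro/include pair that merely mirrors the -D and -I options visible in the compile commands above:

    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler
    import os
    import sysconfig
    import tempfile

    compiler = new_compiler()          # same call run() makes below
    customize_compiler(compiler)       # folds in the interpreter's CC, CFLAGS, etc.

    workdir = tempfile.mkdtemp()
    src = os.path.join(workdir, "noop.c")
    with open(src, "w") as f:
        f.write("int noop(void) { return 0; }\n")

    objects = compiler.compile(
        [src],
        output_dir=tempfile.mkdtemp(),
        macros=[("ENABLE_PYTHON_MODULE", None)],          # mirrors one -D option from the log
        include_dirs=[sysconfig.get_paths()["include"]],  # stand-in for the -I options
    )
    print(objects)
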
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmptdxuvx1p.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmptdxuvx1p.cpp'], output_dir = '/tmp/tmpei1mt_2m' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o', ('/tmp/tmptdxuvx1p.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o', '/tmp/tmptdxuvx1p.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o', src = '/tmp/tmptdxuvx1p.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptdxuvx1p.cpp -o /tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_power2', cxxfile = '/tmp/tmptdxuvx1p.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpy_48kix2', buildtmp = '/tmp/tmpei1mt_2m' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_power2', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
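
For reference, the compile_cxxfile() helper shown earlier in this traceback drives an ordinary distutils build under the hood. A standalone setup.py performing the equivalent wiring would look roughly like the sketch below; the module name and source file are illustrative, not taken from this log, and the import path for PythranBuildExt is assumed to be pythran.dist.

# Sketch only: mirrors the PythranExtension / PythranBuildExt wiring that
# compile_cxxfile() sets up programmatically; "conj_repro" is a made-up name.
from distutils.core import setup
from pythran.dist import PythranExtension, PythranBuildExt

setup(
    name="conj_repro",
    ext_modules=[PythranExtension("conj_repro", ["conj_repro.py"])],
    cmdclass={"build_ext": PythranBuildExt},
)

Running it with "python setup.py build_ext" would go through the same PythranBuildExtMixIn.build_extension() override that appears further down in this traceback.
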
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptdxuvx1p.cpp -o /tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_power2(self): """Check power with shape argument with mean and variance.""" code = """ def numpy_power2(size): from numpy.random import power from numpy import mean, var alpha = 1 rmean = alpha / (alpha + 1) rvar = alpha/((alpha+1)**2*(alpha+2)) a = power(alpha, size=(size, size)) return (abs(mean(a)- rmean) < .05 and abs(var(a) - rvar) < .05) """ > self.run_test(code, 10 ** 3, numpy_power2=[int]) pythran/tests/test_numpy_random.py:1037: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_power2', cxxfile = '/tmp/tmptdxuvx1p.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpy_48kix2', buildtmp = '/tmp/tmpei1mt_2m' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmptdxuvx1p.cpp -o /tmp/tmpei1mt_2m/tmp/tmptdxuvx1p.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_power2' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpei1mt_2m/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmptdxuvx1p.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmptdxuvx1p.cpp:23: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmptdxuvx1p.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_randn0 _______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_randn0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. 
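
The diagnostic captured just above is the actual point of failure: pythonic/include/numpy/conjugate.hpp forwards an xsimd::batch to xsimd::conj(), but the xsimd headers visible to this build do not declare conj, so GCC only finds std::conj and stops. Because pythonic/numpy/var.hpp includes conjugate.hpp, any generated kernel that uses numpy.mean or numpy.var on this buildroot appears to hit the same error before its own logic is compiled, and the same diagnostic recurs below for test_numpy_randn0. A minimal reproduction sketch, assuming the same environment (the module name and kernel below are invented, not from the log):

# Hypothetical reproduction: importing numpy.var in a pythran kernel pulls in
# var.hpp and therefore conjugate.hpp, where the xsimd::conj lookup fails.
from pythran import compile_pythrancode

CODE = """
#pythran export spread(int)
def spread(n):
    from numpy import var
    return var([float(i) for i in range(n)])
"""

# Expected to raise distutils.errors.CompileError carrying the same
# "'conj' is not a member of 'xsimd'" message as the log above.
compile_pythrancode("conj_repro", CODE)
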
Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. 
""" for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. 
> self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpxfos9ob4.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpxfos9ob4.cpp'], output_dir = '/tmp/tmpd90i2_ys' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o', ('/tmp/tmpxfos9ob4.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o', '/tmp/tmpxfos9ob4.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o', src = '/tmp/tmpxfos9ob4.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxfos9ob4.cpp -o /tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_randn0', cxxfile = '/tmp/tmpxfos9ob4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpri4oupys', buildtmp = '/tmp/tmpd90i2_ys' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_randn0', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxfos9ob4.cpp -o /tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_randn0(self): """ Check numpy randn without arguments. 
""" > self.run_test(""" def numpy_randn0(n): from numpy.random import randn from numpy import mean, var a = [randn() for x in range(n)] return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05)""", 10 ** 5, numpy_randn0=[int]) pythran/tests/test_numpy_random.py:403: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_randn0', cxxfile = '/tmp/tmpxfos9ob4.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpri4oupys', buildtmp = '/tmp/tmpd90i2_ys' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxfos9ob4.cpp -o /tmp/tmpd90i2_ys/tmp/tmpxfos9ob4.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 
pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_randn0' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpd90i2_ys/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpxfos9ob4.cpp ----------------------------- Captured stderr call ----------------------------- INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpxfos9ob4.cpp:27: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpxfos9ob4.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran' WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! ______________________ TestNumpyRandom.test_numpy_randn1 _______________________ [gw7] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_randn1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpixdlz74k.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpixdlz74k.cpp'], output_dir = '/tmp/tmpx2tyb858' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpx2tyb858/tmp/tmpixdlz74k.o', ('/tmp/tmpixdlz74k.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpx2tyb858/tmp/tmpixdlz74k.o', '/tmp/tmpixdlz74k.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpx2tyb858/tmp/tmpixdlz74k.o', src = '/tmp/tmpixdlz74k.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpixdlz74k.cpp -o /tmp/tmpx2tyb858/tmp/tmpixdlz74k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_randn1', cxxfile = '/tmp/tmpixdlz74k.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpeij7gpo3', buildtmp = '/tmp/tmpx2tyb858' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_randn1', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpixdlz74k.cpp -o /tmp/tmpx2tyb858/tmp/tmpixdlz74k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def test_numpy_randn1(self): """ Check numpy randn with multiple arguments. 
""" > self.run_test(""" def numpy_randn1(n): from numpy.random import randn from numpy import mean, var a = randn(n, n) return (abs(mean(a)) < .05 and abs(var(a) - 1) < .05)""", 10 ** 3, numpy_randn1=[int]) pythran/tests/test_numpy_random.py:413: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_randn1', cxxfile = '/tmp/tmpixdlz74k.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpeij7gpo3', buildtmp = '/tmp/tmpx2tyb858' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpixdlz74k.cpp -o /tmp/tmpx2tyb858/tmp/tmpixdlz74k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError 
----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_randn1' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpx2tyb858/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpixdlz74k.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11, from /tmp/tmpixdlz74k.cpp:21: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 
25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpixdlz74k.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! ------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_complex _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). 
klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. 
if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, 
**kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpxjr49cjx.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpxjr49cjx.cpp'], output_dir = '/tmp/tmphkb6duz8' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o', ('/tmp/tmpxjr49cjx.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o', '/tmp/tmpxjr49cjx.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o', src = '/tmp/tmpxjr49cjx.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjr49cjx.cpp -o /tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_complex' cxxfile = '/tmp/tmpxjr49cjx.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpx7qlomgf', buildtmp = '/tmp/tmphkb6duz8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjr49cjx.cpp -o /tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_complex' cxxfile = '/tmp/tmpxjr49cjx.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpx7qlomgf', buildtmp = '/tmp/tmphkb6duz8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpxjr49cjx.cpp -o /tmp/tmphkb6duz8/tmp/tmpxjr49cjx.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_complex' extension C compiler: gcc -Wno-unused-result 
-Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmphkb6duz8/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpxjr49cjx.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /tmp/tmpxjr49cjx.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpxjr49cjx.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
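The root cause of this failure is the diagnostic above: pythonic/include/numpy/conjugate.hpp calls xsimd::conj, but the xsimd headers found on the include path do not declare it, so gcc suggests std::conj instead. A rough diagnostic sketch follows; the include directory /usr/include/xsimd and the helper name xsimd_declares_conj are assumptions for illustration, not anything taken from the build.

import pathlib
import re

# Rough check: do the installed xsimd headers declare conj at all?
# /usr/include/xsimd is an assumed location; point this at wherever the
# unbundled xsimd headers actually live.
def xsimd_declares_conj(include_dir="/usr/include/xsimd"):
    pattern = re.compile(r"\bconj\s*\(")
    root = pathlib.Path(include_dir)
    if not root.is_dir():
        raise FileNotFoundError(f"no xsimd headers under {include_dir}")
    return any(pattern.search(header.read_text(errors="ignore"))
               for header in root.rglob("*.hpp"))

if __name__ == "__main__":
    print("xsimd declares conj:", xsimd_declares_conj())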
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_float _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp8ck0pa68.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp8ck0pa68.cpp'], output_dir = '/tmp/tmpg46owjj9' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o', ('/tmp/tmp8ck0pa68.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o', '/tmp/tmp8ck0pa68.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o', src = '/tmp/tmp8ck0pa68.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8ck0pa68.cpp -o /tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_float' cxxfile = '/tmp/tmp8ck0pa68.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5p02bt5_', buildtmp = '/tmp/tmpg46owjj9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8ck0pa68.cpp -o /tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_float' cxxfile = '/tmp/tmp8ck0pa68.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp5p02bt5_', buildtmp = '/tmp/tmpg46owjj9' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp8ck0pa68.cpp -o /tmp/tmpg46owjj9/tmp/tmp8ck0pa68.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_float' extension C compiler: gcc -Wno-unused-result 
-Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpg46owjj9/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp8ck0pa68.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /tmp/tmp8ck0pa68.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp8ck0pa68.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
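This second failure is the same root cause with a different temporary module. To study it outside the pytest harness, one could feed Pythran a small kernel that applies np.conj to a complex array; the sketch below assumes pythran.compile_pythrancode is importable and keeps the (module_name, code) call shape shown in the tracebacks above, and the export spec and function body are illustrative rather than taken from the test suite.

# Sketch of a standalone reproduction; compile_pythrancode and the export spec
# syntax are assumptions based on the toolchain calls visible in the traceback.
from pythran import compile_pythrancode

CODE = """
#pythran export conj_all(complex128[:])
import numpy as np

def conj_all(a):
    return np.conj(a)
"""

if __name__ == "__main__":
    # Expected to raise pythran's CompileError with the same gcc diagnostic
    # when the xsimd headers in use do not provide conj.
    compile_pythrancode("repro_conj", CODE)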
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_float _ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpzcujmdgy.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpzcujmdgy.cpp'], output_dir = '/tmp/tmpqilrnavn' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o', ('/tmp/tmpzcujmdgy.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
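[editor's note] The CCompiler_compile frame above shows how numpy.distutils bounds build parallelism: a module-level semaphore caps the number of concurrent compile jobs, and a lock plus a set of in-flight object files prevents two threads from building the same object at the same time. A rough standalone sketch of that pattern (illustrative names, no real compiler call):

    # Bounded-parallelism pattern from CCompiler_compile/single_compile above:
    # a semaphore caps concurrent jobs; a lock + set ensures two threads never
    # build the same object file simultaneously. fake_compile() is a stand-in.
    import threading
    import time
    from multiprocessing.pool import ThreadPool

    jobs = 4
    job_semaphore = threading.Semaphore(jobs)
    global_lock = threading.Lock()
    processing = set()

    def fake_compile(obj):
        time.sleep(0.01)                # stand-in for self._compile(obj, src, ...)

    def single_compile(obj):
        while True:
            with global_lock:
                if obj not in processing:   # nobody else is building this object
                    processing.add(obj)
                    break
            time.sleep(0.1)                 # someone else is: wait and re-check
        try:
            with job_semaphore:             # at most `jobs` compiles in flight
                fake_compile(obj)
        finally:
            with global_lock:
                processing.remove(obj)

    objects = ['a.o', 'b.o', 'c.o', 'a.o']
    with ThreadPool(jobs) as pool:
        pool.map(single_compile, objects)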
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o', '/tmp/tmpzcujmdgy.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o', src = '/tmp/tmpzcujmdgy.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzcujmdgy.cpp -o /tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_float' cxxfile = '/tmp/tmpzcujmdgy.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpalppa0_v', buildtmp = '/tmp/tmpqilrnavn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzcujmdgy.cpp -o /tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
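[editor's note] The repeated distutils frames all follow the same dispatch chain: setup() builds a Distribution, parses config files and the command line, run_commands() maps each command name ('build_ext' here) to a command object and calls its run(), and any DistutilsError or CCompilerError escaping that loop is turned into SystemExit("error: ..."), which is the string pythran then wraps. A compressed toy sketch of that flow, with hypothetical classes standing in for distutils:

    # Toy model of the dispatch chain seen above:
    # Distribution -> run_commands -> run_command -> cmd.run(),
    # with build errors converted to SystemExit("error: ...").
    class ToyCommand:
        def __init__(self, dist):
            self.dist = dist
        def run(self):
            raise RuntimeError('compiler exited with status 1')  # stand-in for CompileError

    class ToyDistribution:
        commands = ['build_ext']
        cmdclass = {'build_ext': ToyCommand}
        def run_commands(self):
            for name in self.commands:
                self.cmdclass[name](self).run()

    def toy_setup():
        dist = ToyDistribution()
        try:
            dist.run_commands()
        except RuntimeError as msg:      # distutils catches DistutilsError/CCompilerError here
            raise SystemExit('error: ' + str(msg))

    try:
        toy_setup()
    except SystemExit as e:
        print(e)    # -> error: compiler exited with status 1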
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_float' cxxfile = '/tmp/tmpzcujmdgy.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpalppa0_v', buildtmp = '/tmp/tmpqilrnavn' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpzcujmdgy.cpp -o /tmp/tmpqilrnavn/tmp/tmpzcujmdgy.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_float' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpqilrnavn/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpzcujmdgy.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /tmp/tmpzcujmdgy.cpp:12: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpzcujmdgy.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! 
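[editor's note] Both conjugate failures reduce to the same C++ diagnostic: pythonic/include/numpy/conjugate.hpp calls xsimd::conj() on a complex batch, and the xsimd headers in use on this builder (the bundled copy was removed in %prep) do not provide conj, so every test instantiating numpy.conj/numpy.conjugate on complex input fails to compile. A quick way to reproduce this outside the test suite is to push a one-function kernel through the same entry point the tests use, compile_pythrancode; the positional (module_name, source) interface below is inferred from the traceback and should be treated as an assumption.

    # Hypothetical reproduction sketch, mirroring pythran/tests/__init__.py above.
    # The (module_name, source) call shape is an assumption based on the traceback.
    from distutils.errors import CompileError
    from pythran.toolchain import compile_pythrancode

    source = '''
    #pythran export conj_kernel(complex64[])
    import numpy as np
    def conj_kernel(x):
        return np.conj(x)
    '''

    try:
        compile_pythrancode('conj_repro', source)
    except CompileError as exc:
        # Expected on this builder to fail; the underlying gcc diagnostic is the
        # one captured above: "'conj' is not a member of 'xsimd'".
        print('reproduced:', exc)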
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_matrix_complex _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_matrix_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmphsojmq1o.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmphsojmq1o.cpp'], output_dir = '/tmp/tmp08vk8v4h' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o', ('/tmp/tmphsojmq1o.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o', '/tmp/tmphsojmq1o.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o', src = '/tmp/tmphsojmq1o.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphsojmq1o.cpp -o /tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_matrix_complex' cxxfile = '/tmp/tmphsojmq1o.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpamecbbr6', buildtmp = '/tmp/tmp08vk8v4h' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: 
> setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_matrix_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphsojmq1o.cpp -o /tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_matrix_complex' cxxfile = '/tmp/tmphsojmq1o.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpamecbbr6', buildtmp = '/tmp/tmp08vk8v4h' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmphsojmq1o.cpp -o /tmp/tmp08vk8v4h/tmp/tmphsojmq1o.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_matrix_complex' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp08vk8v4h/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmphsojmq1o.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /tmp/tmphsojmq1o.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmphsojmq1o.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
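The traceback above reaches this compiler error through pythran's compile_cxxfile (pythran/toolchain.py), which builds the generated C++ file by driving a distutils build_ext run with a faked command line and converting the SystemExit raised by distutils into a CompileError. A minimal, self-contained sketch of that pattern, using plain distutils Extension instead of pythran's PythranExtension/PythranBuildExt and hypothetical file names, could look like this (assumes the Python 3.10 environment used in this build, where distutils is still available):

    import logging
    from tempfile import mkdtemp
    from distutils.core import setup, Extension
    from distutils.errors import CompileError

    logger = logging.getLogger("sketch")

    def compile_cfile(module_name, cfile):
        """C file -> native module; returns the build directory on success."""
        builddir = mkdtemp()   # where the finished shared library is placed
        buildtmp = mkdtemp()   # where the intermediate object files go
        ext = Extension(module_name, [cfile])
        try:
            # Fake a command line: distutils parses script_args exactly as if
            # they had been typed after "python setup.py".
            setup(name=module_name,
                  ext_modules=[ext],
                  script_name='setup.py',
                  script_args=['--verbose' if logger.isEnabledFor(logging.INFO)
                               else '--quiet',
                               'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            # distutils reports build failures by raising SystemExit("error: ...");
            # turn that back into a CompileError for the caller.
            raise CompileError(str(e))
        return builddir

Calling compile_cfile('demo', 'demo.c') (hypothetical names) would run build_ext out of tree and surface any gcc failure as a CompileError, which mirrors how the error above propagates to pythran/toolchain.py:313.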
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_matrix_complex _ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_matrix_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpjeum34bi.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpjeum34bi.cpp'], output_dir = '/tmp/tmp9v1ze7yx' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o', ('/tmp/tmpjeum34bi.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o', '/tmp/tmpjeum34bi.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o', src = '/tmp/tmpjeum34bi.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjeum34bi.cpp -o /tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_matrix_complex' cxxfile = '/tmp/tmpjeum34bi.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1e36clq5', buildtmp = '/tmp/tmp9v1ze7yx' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) 
try: > setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_matrix_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjeum34bi.cpp -o /tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_matrix_complex' cxxfile = '/tmp/tmpjeum34bi.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp1e36clq5', buildtmp = '/tmp/tmp9v1ze7yx' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpjeum34bi.cpp -o /tmp/tmp9v1ze7yx/tmp/tmpjeum34bi.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_matrix_complex' extension C compiler: 
gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp9v1ze7yx/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpjeum34bi.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /tmp/tmpjeum34bi.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |            ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpjeum34bi.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
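Each of these failures also passes through numpy.distutils' CCompiler_compile, shown repeatedly in the frames above: it bounds the number of concurrent compile jobs with a semaphore, tracks objects that are already being built so a source shared by several extensions is not compiled twice, and fans the remaining work out over a ThreadPool. A simplified, self-contained sketch of that pattern follows (hypothetical names; unlike the original it skips an in-progress object instead of sleeping until it is done):

    import threading
    import multiprocessing.pool

    _global_lock = threading.Lock()
    _processing_files = set()
    _job_semaphore = threading.Semaphore(4)   # assume at most 4 parallel compile jobs

    def single_compile(item):
        obj, src = item
        # Claim the object under the lock; if another thread already owns it,
        # skip it here (the real code waits in a sleep loop instead).
        with _global_lock:
            if obj in _processing_files:
                return
            _processing_files.add(obj)
        try:
            with _job_semaphore:              # never exceed the configured job count
                print("compiling %s -> %s" % (src, obj))
        finally:
            with _global_lock:
                _processing_files.remove(obj)

    build_items = [("tmp%d.o" % i, "tmp%d.cpp" % i) for i in range(8)]
    if len(build_items) > 1:
        with multiprocessing.pool.ThreadPool(4) as pool:   # build in parallel
            pool.map(single_compile, build_items)
    else:
        for item in build_items:                           # build serially
            single_compile(item)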
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_matrix_float _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_matrix_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpi92qvrh7.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpi92qvrh7.cpp'], output_dir = '/tmp/tmp9fvv5pn8' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o', ('/tmp/tmpi92qvrh7.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o', '/tmp/tmpi92qvrh7.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o', src = '/tmp/tmpi92qvrh7.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi92qvrh7.cpp -o /tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_matrix_float' cxxfile = '/tmp/tmpi92qvrh7.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8r0b2flp', buildtmp = '/tmp/tmp9fvv5pn8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_matrix_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi92qvrh7.cpp -o /tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_matrix_float' cxxfile = '/tmp/tmpi92qvrh7.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp8r0b2flp', buildtmp = '/tmp/tmp9fvv5pn8' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpi92qvrh7.cpp -o /tmp/tmp9fvv5pn8/tmp/tmpi92qvrh7.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_matrix_float' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp9fvv5pn8/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpi92qvrh7.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /tmp/tmpi92qvrh7.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |       return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpi92qvrh7.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
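
This second failure is the same error reached through one more hop: the include chain shows numpy/conj.hpp:5 simply pulling in numpy/conjugate.hpp, which is why the np.conj and np.conjugate variants (and both the *_matrix_float and *_matrix_complex tests) fail with the identical diagnostic. The scalar behaviour that the compiler's hint points at is unaffected: without SIMD batches, conjugation is just std::conj applied elementwise, as in the small standard-C++ sketch below (plain std types only; this is an illustration, not the code Pythran actually generates).

    #include <complex>
    #include <iostream>
    #include <vector>

    // Elementwise conjugate over a buffer of complex floats: the scalar
    // behaviour that the failing xsimd batch overload is meant to accelerate.
    std::vector<std::complex<float>> conjugate(const std::vector<std::complex<float>>& in)
    {
        std::vector<std::complex<float>> out;
        out.reserve(in.size());
        for (const auto& z : in)
            out.push_back(std::conj(z));
        return out;
    }

    int main()
    {
        for (const auto& z : conjugate({{1.0f, 2.0f}, {3.0f, -4.0f}}))
            std::cout << z << '\n';   // prints (1,-2) then (3,4)
    }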
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_matrix_float _ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_matrix_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpqwlh3zl1.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpqwlh3zl1.cpp'], output_dir = '/tmp/tmp0jpus6w_' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o', ('/tmp/tmpqwlh3zl1.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o', '/tmp/tmpqwlh3zl1.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o', src = '/tmp/tmpqwlh3zl1.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqwlh3zl1.cpp -o /tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_matrix_float' cxxfile = '/tmp/tmpqwlh3zl1.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpl4tg9lyw', buildtmp = '/tmp/tmp0jpus6w_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) 
try: > setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_matrix_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqwlh3zl1.cpp -o /tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
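Two layers of error translation are visible here: distutils.core.setup() turns the DistutilsError/CCompilerError raised by the build into SystemExit("error: ..."), as in the frame above, and pythran's compile_cxxfile (shown a little further down) catches that SystemExit and re-raises it as CompileError, so callers of the toolchain only ever see CompileError. A compressed sketch of that translation, with a stand-in for the setup() call:

from distutils.errors import CompileError

def fake_setup():
    # stand-in for the setup(...) call in compile_cxxfile; distutils has already
    # converted the underlying build failure into SystemExit("error: ...")
    raise SystemExit('error: Command "gcc ..." failed with exit status 1')

try:
    try:
        fake_setup()
    except SystemExit as e:          # what distutils raised
        raise CompileError(str(e))   # what pythran's toolchain propagates
except CompileError as e:
    print("caller sees:", e)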
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_matrix_float' cxxfile = '/tmp/tmpqwlh3zl1.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpl4tg9lyw', buildtmp = '/tmp/tmp0jpus6w_' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpqwlh3zl1.cpp -o /tmp/tmp0jpus6w_/tmp/tmpqwlh3zl1.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_matrix_float' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp0jpus6w_/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpqwlh3zl1.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /tmp/tmpqwlh3zl1.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpqwlh3zl1.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
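The diagnostic above is the actual root cause of every failure in this section: pythran's bundled pythonic/include/numpy/conjugate.hpp calls xsimd::conj on a complex batch, and the xsimd headers found on the include path do not provide that function, so any test exercising np.conj/np.conjugate fails the same way regardless of argument type. The failure can be reproduced outside the test harness through the toolchain entry point named in the traceback (pythran.toolchain.compile_pythrancode); the kernel, export line and module name below are illustrative, not taken from the test suite.

# Reproduction sketch using the same toolchain entry point as the failing tests.
from pythran.toolchain import compile_pythrancode

KERNEL = """
#pythran export conj_repro(complex128[])
import numpy as np
def conj_repro(x):
    return np.conjugate(x)
"""

try:
    compile_pythrancode("conj_repro", KERNEL)
except Exception as exc:   # expected: distutils.errors.CompileError mentioning xsimd::conj
    print(type(exc).__name__, exc)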
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_scalar_complex _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_scalar_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpnxhnum5x.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpnxhnum5x.cpp'], output_dir = '/tmp/tmp29vuh76i' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
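The comment above about extra_compile_args and CFLAGS explains the shape of the failing command line: the distribution's hardened flags (-O2, -Wall, ...) come from the base compiler settings, the test harness's extra arguments (-O1, -w, -UNDEBUG, ...) are appended afterwards, and the compiler honours whichever option comes last. A toy illustration of that ordering follows; the flag values are copied from this log and the file names are placeholders.

# Base flags first, the extension's extra_compile_args appended last,
# so the trailing -O1 / -w override the earlier -O2 / -Wall.
base_cflags = ["-O2", "-Wall", "-Werror=format-security"]
extra_compile_args = ["-O1", "-Wall", "-w", "-UNDEBUG"]
cmd = ["gcc"] + base_cflags + ["-c", "example.cpp", "-o", "example.o"] + extra_compile_args
print(" ".join(cmd))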
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o', ('/tmp/tmpnxhnum5x.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o', '/tmp/tmpnxhnum5x.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o', src = '/tmp/tmpnxhnum5x.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnxhnum5x.cpp -o /tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_scalar_complex' cxxfile = '/tmp/tmpnxhnum5x.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn80mkkg6', buildtmp = '/tmp/tmp29vuh76i' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: 
> setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_scalar_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnxhnum5x.cpp -o /tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_scalar_complex' cxxfile = '/tmp/tmpnxhnum5x.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpn80mkkg6', buildtmp = '/tmp/tmp29vuh76i' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpnxhnum5x.cpp -o /tmp/tmp29vuh76i/tmp/tmpnxhnum5x.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_scalar_complex' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmp29vuh76i/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpnxhnum5x.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /tmp/tmpnxhnum5x.cpp:10: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpnxhnum5x.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! 
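This second failure is identical to the first apart from the temporary file names and the tested ufunc (numpy.conj instead of numpy.conjugate), which points at the shared conjugate.hpp header rather than at any individual test. When iterating on a header fix it may be faster to re-run only the affected generated tests; the -k expression below is a guess based on the test names in this log, not a documented target.

# Sketch: re-run only the conj/conjugate ufunc tests from the unpacked source tree.
import sys
import pytest

sys.exit(pytest.main([
    "pythran/tests",
    "-k", "numpy_conjugate or numpy_conj",
    "-x",        # stop at the first failure
    "-q",
]))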
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_scalar_complex _ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_scalar_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp2zigxtqf.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp2zigxtqf.cpp'], output_dir = '/tmp/tmpsijhjq9l' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o', ('/tmp/tmp2zigxtqf.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o', '/tmp/tmp2zigxtqf.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o', src = '/tmp/tmp2zigxtqf.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2zigxtqf.cpp -o /tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_scalar_complex' cxxfile = '/tmp/tmp2zigxtqf.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu0gh2kbn', buildtmp = '/tmp/tmpsijhjq9l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) 
try: > setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_scalar_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2zigxtqf.cpp -o /tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
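[editor's note] The chain above is the interesting part of the traceback: distutils' setup() wraps the CCompilerError in SystemExit("error: ..."), and pythran's compile_cxxfile converts that back into a distutils CompileError. A standalone sketch of the same translation pattern follows; Extension is used here instead of PythranExtension to keep it self-contained, and build_native_module is a hypothetical helper name.

# Sketch of the SystemExit -> CompileError translation used by compile_cxxfile:
# drive distutils programmatically with a fake "setup.py build_ext" command line,
# then re-raise compiler failures as a single exception type callers can catch.
from distutils.core import setup
from distutils.extension import Extension
from distutils.errors import CompileError
from tempfile import mkdtemp

def build_native_module(name, cxxfile, **kwargs):
    builddir, buildtmp = mkdtemp(), mkdtemp()
    ext = Extension(name, [cxxfile], **kwargs)
    try:
        setup(name=name,
              ext_modules=[ext],
              script_name='setup.py',          # fake CLI call, as in the log
              script_args=['--quiet', 'build_ext',
                           '--build-lib', builddir,
                           '--build-temp', buildtmp])
    except SystemExit as exc:
        # distutils reports build failures by raising SystemExit("error: ...").
        raise CompileError(str(exc))
    return builddir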
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_scalar_complex' cxxfile = '/tmp/tmp2zigxtqf.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpu0gh2kbn', buildtmp = '/tmp/tmpsijhjq9l' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp2zigxtqf.cpp -o /tmp/tmpsijhjq9l/tmp/tmp2zigxtqf.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_scalar_complex' extension C compiler: 
gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpsijhjq9l/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmp2zigxtqf.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /tmp/tmp2zigxtqf.cpp:10: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmp2zigxtqf.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! 
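[editor's note] The "-DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas" options on the gcc command lines above come from pythran's [compiler] configuration, the same object pythran/dist.py queries as cfg.cfg.get('compiler', 'ignoreflags') in the frames. A small inspection sketch follows; only 'ignoreflags' is confirmed by this log, the other key names are assumptions based on the documented ~/.pythranrc format and may differ between pythran versions.

# Sketch: dump the effective [compiler] settings that shape the compile line.
from pythran import config

for key in ("blas", "include_dirs", "libs", "cflags", "ignoreflags"):
    try:
        print(key, "=", config.cfg.get("compiler", key))
    except Exception as exc:  # a key may be absent in other pythran versions
        print(key, ": not set (", exc, ")")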
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_scalar_float _ [gw0] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_scalar_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpwhi7rnt_.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpwhi7rnt_.cpp'], output_dir = '/tmp/tmpb97b2jt2' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o', ('/tmp/tmpwhi7rnt_.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o', '/tmp/tmpwhi7rnt_.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o', src = '/tmp/tmpwhi7rnt_.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwhi7rnt_.cpp -o /tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_scalar_float' cxxfile = '/tmp/tmpwhi7rnt_.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2z9t72fh', buildtmp = '/tmp/tmpb97b2jt2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) 
try: > setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_scalar_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwhi7rnt_.cpp -o /tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_scalar_float' cxxfile = '/tmp/tmpwhi7rnt_.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp2z9t72fh', buildtmp = '/tmp/tmpb97b2jt2' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpwhi7rnt_.cpp -o /tmp/tmpb97b2jt2/tmp/tmpwhi7rnt_.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_scalar_float' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpb97b2jt2/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpwhi7rnt_.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /tmp/tmpwhi7rnt_.cpp:10: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpwhi7rnt_.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! 
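The root cause visible in the captured stderr above: pythran's pythonic/include/numpy/conjugate.hpp calls xsimd::conj(), which the system xsimd headers used in this build (the bundled third_party/xsimd copy is removed in %prep) apparently do not provide, so g++ can only suggest std::conj. A minimal reproduction sketch, assuming pythran 0.11.0 against the same system xsimd; compile_pythrancode is the entry point already shown in the traceback, and the module and function names here are illustrative:

    # Hedged reproduction sketch -- assumes pythran 0.11.0 with the bundled xsimd removed,
    # as in this build root; any np.conjugate/np.conj use should include the same header.
    import textwrap
    from distutils.errors import CompileError
    from pythran.toolchain import compile_pythrancode  # same call chain as pythran/tests/__init__.py:312

    PYTHRAN_CODE = textwrap.dedent('''
        #pythran export repro_conjugate(float64)
        import numpy as np

        def repro_conjugate(x):
            return np.conjugate(x)
    ''')

    try:
        # Illustrative module name, mirroring test_numpy_ufunc_unary_numpy_conjugate_scalar_float.
        compile_pythrancode('repro_conjugate_scalar_float', PYTHRAN_CODE)
    except CompileError as exc:
        # Expected on this system: "error: 'conj' is not a member of 'xsimd'".
        print(exc)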
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_scalar_float _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_scalar_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpj94e4dah.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpj94e4dah.cpp'], output_dir = '/tmp/tmpfq8_is_v' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o', ('/tmp/tmpj94e4dah.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o', '/tmp/tmpj94e4dah.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o', src = '/tmp/tmpj94e4dah.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpj94e4dah.cpp -o /tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conj_scalar_float' cxxfile = '/tmp/tmpj94e4dah.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpp1ez7l2x', buildtmp = '/tmp/tmpfq8_is_v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conj_scalar_float', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpj94e4dah.cpp -o /tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conj_scalar_float' cxxfile = '/tmp/tmpj94e4dah.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpp1ez7l2x', buildtmp = '/tmp/tmpfq8_is_v' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpj94e4dah.cpp -o /tmp/tmpfq8_is_v/tmp/tmpj94e4dah.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conj_scalar_float' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC creating /tmp/tmpfq8_is_v/tmp compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c' extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas' gcc: /tmp/tmpj94e4dah.cpp ----------------------------- Captured stderr call ----------------------------- cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5, from /tmp/tmpj94e4dah.cpp:10: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’: /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’? 25 | return xsimd::conj(v); | ^~~~ In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4, from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32, from /tmp/tmpj94e4dah.cpp:1: /usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here 1975 | conj(_Tp __x) | ^~~~ At global scope: cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics WARNING: Compilation error, trying hard to find its origin... WARNING: Nop, I'm going to flood you with C++ errors! 
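The recurring pattern in these tracebacks is pythran's compile_cxxfile(): it drives an ordinary numpy.distutils setup() call with a fake command line ('build_ext --build-lib ... --build-temp ...') and converts the SystemExit that distutils raises on a compiler failure back into a CompileError, which is what the tests then report. A minimal sketch of that mechanism, using the same classes imported in the frames above (PythranExtension, PythranBuildExt from pythran.dist); locating the produced shared library under builddir is omitted here:

    # Hedged sketch of the compile_cxxfile() mechanism seen in pythran/toolchain.py above.
    from tempfile import mkdtemp
    from distutils.errors import CompileError
    from numpy.distutils.core import setup          # the wrapped setup() shown in the traceback
    from pythran.dist import PythranExtension, PythranBuildExt

    def build_native_module(module_name, cxxfile, **kwargs):
        """C++ file -> native module; re-raise distutils' SystemExit as CompileError."""
        builddir, buildtmp = mkdtemp(), mkdtemp()
        ext = PythranExtension(module_name, [cxxfile], **kwargs)
        try:
            setup(name=module_name,
                  ext_modules=[ext],
                  cmdclass={'build_ext': PythranBuildExt},
                  # fake CLI call, as in the log above
                  script_name='setup.py',
                  script_args=['--quiet', 'build_ext',
                               '--build-lib', builddir,
                               '--build-temp', buildtmp])
        except SystemExit as e:
            raise CompileError(str(e))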
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _ TestNumpyUFuncUnary.test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_complex _ [gw1] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmp1k1hlc3k.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmp1k1hlc3k.cpp'], output_dir = '/tmp/tmp0iuntzxw' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o', ('/tmp/tmp1k1hlc3k.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
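The numpy.distutils frame above (CCompiler_compile together with its nested single_compile) bounds parallel compilation with a module-level lock, a set of objects currently being built, and a job semaphore, so the same source shared by several extensions is only compiled once at a time. The following is a self-contained sketch of that pattern; the names are hypothetical stand-ins, not numpy.distutils API.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for numpy.distutils' module-level state.
_global_lock = threading.Lock()
_processing = set()
_job_semaphore = threading.Semaphore(4)      # assume 4 build jobs


def compile_one(obj, compile_fn):
    """Build `obj` exactly once, even if several extensions share its source."""
    while True:
        with _global_lock:                   # no atomic check-and-add, so lock
            if obj not in _processing:
                _processing.add(obj)
                break
        time.sleep(0.1)                      # another worker owns it; wait
    try:
        with _job_semaphore:                 # cap concurrent compiler processes
            compile_fn(obj)
    finally:
        with _global_lock:                   # register being done processing
            _processing.discard(obj)


def build(objects, compile_fn, jobs=4):
    if jobs > 1 and len(objects) > 1:        # parallel build
        with ThreadPoolExecutor(jobs) as pool:
            list(pool.map(lambda o: compile_one(o, compile_fn), objects))
    else:                                    # serial build
        for o in objects:
            compile_one(o, compile_fn)
```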
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o', '/tmp/tmp1k1hlc3k.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o', src = '/tmp/tmp1k1hlc3k.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1k1hlc3k.cpp -o /tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = 'test_numpy_ufunc_unary_numpy_conjugate_complex' cxxfile = '/tmp/tmp1k1hlc3k.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_n380mhw', buildtmp = '/tmp/tmp0iuntzxw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': 'test_numpy_ufunc_unary_numpy_conjugate_complex', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1k1hlc3k.cpp -o /tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = > ??? 
:1: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:312: in run_test cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = 'test_numpy_ufunc_unary_numpy_conjugate_complex' cxxfile = '/tmp/tmp1k1hlc3k.cpp', output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmp_n380mhw', buildtmp = '/tmp/tmp0iuntzxw' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmp1k1hlc3k.cpp -o /tmp/tmp0iuntzxw/tmp/tmp1k1hlc3k.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building 'test_numpy_ufunc_unary_numpy_conjugate_complex' extension C compiler: gcc 
-Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmp0iuntzxw/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmp1k1hlc3k.cpp
----------------------------- Captured stderr call -----------------------------
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /tmp/tmp1k1hlc3k.cpp:12:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                     ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmp1k1hlc3k.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
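The compile_cxxfile frames captured in these tracebacks drive distutils programmatically with a fake command line (build_ext --build-lib ... --build-temp ...) and convert the resulting SystemExit into a CompileError. Below is a stripped-down sketch of that pattern using plain setuptools classes in place of pythran's PythranExtension/PythranBuildExt; the function name and paths are hypothetical.

```python
# A minimal sketch of the compile_cxxfile pattern seen above: build one C++
# source into a native module via a programmatic "fake CLI" build_ext run.
from tempfile import mkdtemp
from setuptools import setup, Extension


def build_cxx_module(module_name, cxxfile, extra_compile_args=()):
    builddir = mkdtemp()     # where the finished shared library is placed
    buildtmp = mkdtemp()     # where intermediate object files go
    ext = Extension(module_name, [cxxfile],
                    language="c++",
                    extra_compile_args=list(extra_compile_args))
    try:
        # Equivalent of: python setup.py --quiet build_ext
        #                  --build-lib <builddir> --build-temp <buildtmp>
        setup(name=module_name,
              ext_modules=[ext],
              script_name="setup.py",
              script_args=["--quiet", "build_ext",
                           "--build-lib", builddir,
                           "--build-temp", buildtmp])
    except SystemExit as exc:
        # distutils reports compiler failures via SystemExit; surface them as
        # an exception the caller can handle, as compile_cxxfile does.
        raise RuntimeError(str(exc))
    return builddir


# Example (hypothetical file):
#   build_cxx_module("demo", "demo.cpp", ["-std=c++11"])
```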
------------------------------ Captured log call ------------------------------- WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin... WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _________________ TestScipy.test__calc_binned_statistic_norun0 _________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': '_calc_binned_statistic_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. 
try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpcxsh1btl.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpcxsh1btl.cpp'], output_dir = '/tmp/tmpq3z3d40y' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o', ('/tmp/tmpcxsh1btl.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o', '/tmp/tmpcxsh1btl.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o', src = '/tmp/tmpcxsh1btl.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcxsh1btl.cpp -o /tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = '_calc_binned_statistic_norun00', cxxfile = '/tmp/tmpcxsh1btl.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfcprr7a0', buildtmp = '/tmp/tmpq3z3d40y' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > 
setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': '_calc_binned_statistic_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcxsh1btl.cpp -o /tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = '_calc_binned_statistic_norun00', cxxfile = '/tmp/tmpcxsh1btl.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmpfcprr7a0', buildtmp = '/tmp/tmpq3z3d40y' 
extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpcxsh1btl.cpp -o /tmp/tmpq3z3d40y/tmp/tmpcxsh1btl.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building '_calc_binned_statistic_norun00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong 
-m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpq3z3d40y/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpcxsh1btl.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conjugate.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/var.hpp:11,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/std_.hpp:5,
                 from /tmp/tmpcxsh1btl.cpp:51:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpcxsh1btl.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
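The compiler diagnostic above is the actual failure: pythonic/include/numpy/conjugate.hpp:25 calls xsimd::conj(v), but the xsimd headers visible to this build do not declare conj in the xsimd namespace, so every test that instantiates this conjugate wrapper fails the same way. A hedged reproduction sketch outside the test suite, assuming the pythran CLI is on PATH and that numpy.conjugate is lowered through the same header; the kernel and file name below are made up for illustration:

# Hedged reproduction sketch (not from the log): compile a one-line kernel
# that goes through pythonic/include/numpy/conjugate.hpp, using the pythran
# CLI. The kernel and file name are hypothetical.
import pathlib
import subprocess

src = pathlib.Path("conj_repro.py")
src.write_text(
    "#pythran export kernel(complex128[])\n"
    "import numpy as np\n"
    "def kernel(x):\n"
    "    return np.conjugate(x)\n"
)

# If the xsimd headers in use lack xsimd::conj, this should fail with the
# same "'conj' is not a member of 'xsimd'" diagnostic seen above.
subprocess.run(["pythran", str(src)], check=True)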
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors! _______________________ TestScipy.test__rbfinterp_norun0 _______________________ [gw5] linux -- Python 3.10.1 /usr/bin/python3 attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': '_rbfinterp_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: > dist.run_commands() /usr/lib64/python3.10/distutils/core.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run_commands(self): """Run each command that was seen on the setup script command line. Uses the list of commands found and cache of command objects created by 'get_command_obj()'. """ for cmd in self.commands: > self.run_command(cmd) /usr/lib64/python3.10/distutils/dist.py:966: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = command = 'build_ext' def run_command(self, command): """Do whatever it takes to run a command (including nothing at all, if the command has already been run). Specifically: if we have already created and run the command named by 'command', return silently without doing anything. If the command named by 'command' doesn't even have a command object yet, create one. Then invoke 'run()' on that command object (or an existing one). """ # Already been here, done that? then return silently. if self.have_run.get(command): return log.info("running %s", command) cmd_obj = self.get_command_obj(command) cmd_obj.ensure_finalized() > cmd_obj.run() /usr/lib64/python3.10/distutils/dist.py:985: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def run(self): from distutils.ccompiler import new_compiler # 'self.extensions', as supplied by setup.py, is a list of # Extension instances. See the documentation for Extension (in # distutils.extension) for details. # # For backwards compatibility with Distutils 0.8.2 and earlier, we # also allow the 'extensions' list to be a list of tuples: # (ext_name, build_info) # where build_info is a dictionary containing everything that # Extension instances do except the name, with a few things being # differently named. We convert these 2-tuples to Extension # instances as needed. if not self.extensions: return # If we were asked to build any C/C++ libraries, make sure that the # directory where we put them is in the library search path for # linking extensions. if self.distribution.has_c_libraries(): build_clib = self.get_finalized_command('build_clib') self.libraries.extend(build_clib.get_library_names() or []) self.library_dirs.append(build_clib.build_clib) # Setup the CCompiler object that we'll use to do all the # compiling and linking self.compiler = new_compiler(compiler=self.compiler, verbose=self.verbose, dry_run=self.dry_run, force=self.force) customize_compiler(self.compiler) # If we are cross-compiling, init the compiler now (if we are not # cross-compiling, init would not hurt, but people may rely on # late initialization of compiler even if they shouldn't...) if os.name == 'nt' and self.plat_name != get_platform(): self.compiler.initialize(self.plat_name) # And make sure that any compile/link-related options (which might # come from the command-line or from the setup script) are set in # that CCompiler object -- that way, they automatically apply to # all compiling and linking done here. 
if self.include_dirs is not None: self.compiler.set_include_dirs(self.include_dirs) if self.define is not None: # 'define' option is a list of (name,value) tuples for (name, value) in self.define: self.compiler.define_macro(name, value) if self.undef is not None: for macro in self.undef: self.compiler.undefine_macro(macro) if self.libraries is not None: self.compiler.set_libraries(self.libraries) if self.library_dirs is not None: self.compiler.set_library_dirs(self.library_dirs) if self.rpath is not None: self.compiler.set_runtime_library_dirs(self.rpath) if self.link_objects is not None: self.compiler.set_link_objects(self.link_objects) # Now actually compile and link everything. > self.build_extensions() /usr/lib64/python3.10/distutils/command/build_ext.py:340: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def build_extensions(self): # First, sanity-check the 'extensions' list self.check_extensions_list(self.extensions) if self.parallel: self._build_extensions_parallel() else: > self._build_extensions_serial() /usr/lib64/python3.10/distutils/command/build_ext.py:449: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _build_extensions_serial(self): for ext in self.extensions: with self._filter_build_errors(ext): > self.build_extension(ext) /usr/lib64/python3.10/distutils/command/build_ext.py:474: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): StringTypes = str, def get_value(obj, key): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): return var[0] else: return var def set_value(obj, key, value): var = getattr(obj, key) if isinstance(var, Iterable) and not isinstance(var, StringTypes): var[0] = value else: setattr(obj, key, value) prev = { # linux-like 'preprocessor': None, 'compiler_cxx': None, 'compiler_so': None, 'compiler': None, 'linker_exe': None, 'linker_so': None, # Windows-like 'cc': None, } # Backup compiler settings for key in list(prev.keys()): if hasattr(self.compiler, key): prev[key] = get_value(self.compiler, key) else: del prev[key] # try hard to modify the compiler if getattr(ext, 'cxx', None) is not None: for comp in prev: if hasattr(self.compiler, comp): set_value(self.compiler, comp, ext.cxx) find_exe = None if getattr(ext, 'cc', None) is not None: try: import distutils._msvccompiler as msvc # install hook find_exe = msvc._find_exe def _find_exe(exe, *args, **kwargs): if exe == 'cl.exe': exe = ext.cc return find_exe(exe, *args, **kwargs) msvc._find_exe = _find_exe except ImportError: pass # In general, distutils uses -Wstrict-prototypes, but this option # is not valid for C++ code, only for C. Remove it if it's there # to avoid a spurious warning on every compilation. 
for flag in cfg.cfg.get('compiler', "ignoreflags").split(): for target in ('compiler_so', 'linker_so'): try: while True: getattr(self.compiler, target).remove(flag) except (AttributeError, ValueError): pass # Remove -arch i386 if 'x86_64' is specified, otherwise incorrect # code is generated, at least on OSX if hasattr(self.compiler, 'compiler_so'): archs = defaultdict(list) for i, flag in enumerate(self.compiler.compiler_so[1:]): if self.compiler.compiler_so[i] == '-arch': archs[flag].append(i + 1) if 'x86_64' in archs and 'i386' in archs: for i in archs['i386']: self.compiler.compiler_so[i] = 'x86_64' try: > return super(PythranBuildExtMixIn, self).build_extension(ext) pythran/dist.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ext = def build_extension(self, ext): sources = ext.sources if sources is None or not isinstance(sources, (list, tuple)): raise DistutilsSetupError( "in 'ext_modules' option (extension '%s'), " "'sources' must be present and must be " "a list of source filenames" % ext.name) # sort to make the resulting .so file build reproducible sources = sorted(sources) ext_path = self.get_ext_fullpath(ext.name) depends = sources + ext.depends if not (self.force or newer_group(depends, ext_path, 'newer')): log.debug("skipping '%s' extension (up-to-date)", ext.name) return else: log.info("building '%s' extension", ext.name) # First, scan the sources for SWIG definition files (.i), run # SWIG on 'em to create .c files, and modify the sources list # accordingly. sources = self.swig_sources(sources, ext) # Next, compile the source code to object files. # XXX not honouring 'define_macros' or 'undef_macros' -- the # CCompiler API needs to change to accommodate this, and I # want to do one thing at a time! # Two possible sources for extra compiler arguments: # - 'extra_compile_args' in Extension object # - CFLAGS environment variable (not particularly # elegant, but people seem to expect it and I # guess it's useful) # The environment variable should take precedence, and # any sensible compiler will give precedence to later # command line args. 
Hence we combine them in order: extra_args = ext.extra_compile_args or [] macros = ext.define_macros[:] for undef in ext.undef_macros: macros.append((undef,)) > objects = self.compiler.compile(sources, output_dir=self.build_temp, macros=macros, include_dirs=ext.include_dirs, debug=self.debug, extra_postargs=extra_args, depends=ext.depends) /usr/lib64/python3.10/distutils/command/build_ext.py:529: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = (['/tmp/tmpx_97m4xm.cpp'],) kw = {'debug': None, 'depends': [], 'extra_postargs': ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv'...'/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'], ...} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = sources = ['/tmp/tmpx_97m4xm.cpp'], output_dir = '/tmp/tmpraat08vq' macros = [('ENABLE_PYTHON_MODULE', None), ('__PYTHRAN__', '3'), ('PYTHRAN_BLAS_OPENBLAS', None)] include_dirs = ['/usr/include/flexiblas', '/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '/usr/lib64/python3.10/site-packages/numpy/core/include'] debug = None, extra_preargs = None extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] depends = [] def CCompiler_compile(self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None): """ Compile one or more source files. Please refer to the Python distutils API reference for more details. Parameters ---------- sources : list of str A list of filenames output_dir : str, optional Path to the output directory. macros : list of tuples A list of macro definitions. include_dirs : list of str, optional The directories to add to the default include file search path for this compilation only. debug : bool, optional Whether or not to output debug symbols in or alongside the object file(s). extra_preargs, extra_postargs : ? Extra pre- and post-arguments. depends : list of str, optional A list of file names that all targets depend on. Returns ------- objects : list of str A list of object file names, one per source file `sources`. Raises ------ CompileError If compilation fails. """ # This method is effective only with Python >=2.3 distutils. # Any changes here should be applied also to fcompiler.compile # method to support pre Python 2.3 distutils. 
global _job_semaphore jobs = get_num_build_jobs() # setup semaphore to not exceed number of compile jobs when parallelized at # extension level (python >= 3.5) with _global_lock: if _job_semaphore is None: _job_semaphore = threading.Semaphore(jobs) if not sources: return [] from numpy.distutils.fcompiler import (FCompiler, is_f_file, has_f90_header) if isinstance(self, FCompiler): display = [] for fc in ['f77', 'f90', 'fix']: fcomp = getattr(self, 'compiler_'+fc) if fcomp is None: continue display.append("Fortran %s compiler: %s" % (fc, ' '.join(fcomp))) display = '\n'.join(display) else: ccomp = self.compiler_so display = "C compiler: %s\n" % (' '.join(ccomp),) log.info(display) macros, objects, extra_postargs, pp_opts, build = \ self._setup_compile(output_dir, macros, include_dirs, sources, depends, extra_postargs) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) display = "compile options: '%s'" % (' '.join(cc_args)) if extra_postargs: display += "\nextra options: '%s'" % (' '.join(extra_postargs)) log.info(display) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) finally: # register being done processing with _global_lock: _processing_files.remove(obj) if isinstance(self, FCompiler): objects_to_build = list(build.keys()) f77_objects, other_objects = [], [] for obj in objects: if obj in objects_to_build: src, ext = build[obj] if self.compiler_type=='absoft': obj = cyg2win32(obj) src = cyg2win32(src) if is_f_file(src) and not has_f90_header(src): f77_objects.append((obj, (src, ext))) else: other_objects.append((obj, (src, ext))) # f77 objects can be built in parallel build_items = f77_objects # build f90 modules serial, module files are generated during # compilation and may be used by files later in the list so the # ordering is important for o in other_objects: single_compile(o) else: build_items = build.items() if len(build) > 1 and jobs > 1: # build parallel import multiprocessing.pool pool = multiprocessing.pool.ThreadPool(jobs) pool.map(single_compile, build_items) pool.close() else: # build serial for o in build_items: > single_compile(o) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ('/tmp/tmpraat08vq/tmp/tmpx_97m4xm.o', ('/tmp/tmpx_97m4xm.cpp', '.cpp')) def single_compile(args): obj, (src, ext) = args if not _needs_build(obj, cc_args, extra_postargs, pp_opts): return # check if we are currently already processing the same object # happens when using the same source in multiple extensions while True: # need explicit lock as there is no atomic check and add with GIL with _global_lock: # file not being worked on, start working if obj not in _processing_files: _processing_files.add(obj) break # wait for the processing to end time.sleep(0.1) try: # retrieve slot from our #job semaphore and build with _job_semaphore: > self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) 
/usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = args = ('/tmp/tmpraat08vq/tmp/tmpx_97m4xm.o', '/tmp/tmpx_97m4xm.cpp', '.cpp', ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '...builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...]) kw = {} > m = lambda self, *args, **kw: func(self, *args, **kw) /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:88: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = obj = '/tmp/tmpraat08vq/tmp/tmpx_97m4xm.o', src = '/tmp/tmpx_97m4xm.cpp' ext = '.cpp' cc_args = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] extra_postargs = ['-std=c++11', '-fno-math-errno', '-fvisibility=hidden', '-fno-wrapv', '-Wno-unused-function', '-Wno-int-in-bool-context', ...] pp_opts = ['-DENABLE_PYTHON_MODULE', '-D__PYTHRAN__=3', '-DPYTHRAN_BLAS_OPENBLAS', '-I/usr/include/flexiblas', '-I/builddir/build/BUILD/pythran-feature-0.11.0/pythran', '-I/usr/lib64/python3.10/site-packages/numpy/core/include', ...] def UnixCCompiler__compile(self, obj, src, ext, cc_args, extra_postargs, pp_opts): """Compile a single source files with a Unix-style compiler.""" # HP ad-hoc fix, see ticket 1383 ccomp = self.compiler_so if ccomp[0] == 'aCC': # remove flags that will trigger ANSI-C mode for aCC if '-Ae' in ccomp: ccomp.remove('-Ae') if '-Aa' in ccomp: ccomp.remove('-Aa') # add flags for (almost) sane C++ handling ccomp += ['-AA'] self.compiler_so = ccomp # ensure OPT environment variable is read if 'OPT' in os.environ: # XXX who uses this? 
from sysconfig import get_config_vars opt = " ".join(os.environ['OPT'].split()) gcv_opt = " ".join(get_config_vars('OPT')[0].split()) ccomp_s = " ".join(self.compiler_so) if opt not in ccomp_s: ccomp_s = ccomp_s.replace(gcv_opt, opt) self.compiler_so = ccomp_s.split() llink_s = " ".join(self.linker_so) if opt not in llink_s: self.linker_so = llink_s.split() + opt.split() display = '%s: %s' % (os.path.basename(self.compiler_so[0]), src) # gcc style automatic dependencies, outputs a makefile (-MF) that lists # all headers needed by a c file as a side effect of compilation (-MMD) if getattr(self, '_auto_depends', False): deps = ['-MMD', '-MF', obj + '.d'] else: deps = [] try: self.spawn(self.compiler_so + cc_args + [src, '-o', obj] + deps + extra_postargs, display = display) except DistutilsExecError as e: msg = str(e) > raise CompileError(msg) from None E distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx_97m4xm.cpp -o /tmp/tmpraat08vq/tmp/tmpx_97m4xm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/site-packages/numpy/distutils/unixccompiler.py:57: CompileError During handling of the above exception, another exception occurred: module_name = '_rbfinterp_norun00', cxxfile = '/tmp/tmpx_97m4xm.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppsamfwh0', buildtmp = '/tmp/tmpraat08vq' extension = def compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: > setup(name=module_name, 
ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) pythran/toolchain.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attr = {'cmdclass': {'build_ext': }, 'ext_modules': [, 'build': , 'build_ext': , ...} new_attr = {'cmdclass': {'bdist_rpm': , 'build': , 'ext_modules': [], 'headers': [], ...} def setup(**attr): cmdclass = numpy_cmdclass.copy() new_attr = attr.copy() if 'cmdclass' in new_attr: cmdclass.update(new_attr['cmdclass']) new_attr['cmdclass'] = cmdclass if 'configuration' in new_attr: # To avoid calling configuration if there are any errors # or help request in command in the line. configuration = new_attr.pop('configuration') old_dist = distutils.core._setup_distribution old_stop = distutils.core._setup_stop_after distutils.core._setup_distribution = None distutils.core._setup_stop_after = "commandline" try: dist = setup(**new_attr) finally: distutils.core._setup_distribution = old_dist distutils.core._setup_stop_after = old_stop if dist.help or not _command_line_ok(): # probably displayed help, skip running any commands return dist # create setup dictionary and append to new_attr config = configuration() if hasattr(config, 'todict'): config = config.todict() _dict_append(new_attr, **config) # Move extension source libraries to libraries libraries = [] for ext in new_attr.get('ext_modules', []): new_libraries = [] for item in ext.libraries: if is_sequence(item): lib_name, build_info = item _check_append_ext_library(libraries, lib_name, build_info) new_libraries.append(lib_name) elif is_string(item): new_libraries.append(item) else: raise TypeError("invalid description of extension module " "library %r" % (item,)) ext.libraries = new_libraries if libraries: if 'libraries' not in new_attr: new_attr['libraries'] = [] for item in libraries: _check_append_library(new_attr['libraries'], item) # sources in ext_modules or libraries may contain header files if ('ext_modules' in new_attr or 'libraries' in new_attr) \ and 'headers' not in new_attr: new_attr['headers'] = [] # Use our custom NumpyDistribution class instead of distutils' one new_attr['distclass'] = NumpyDistribution > return old_setup(**new_attr) /usr/lib64/python3.10/site-packages/numpy/distutils/core.py:169: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attrs = {'cmdclass': {'bdist_rpm': , 'build': ], 'headers': [], 'name': '_rbfinterp_norun00', ...} klass = dist = ok = True def setup (**attrs): """The gateway to the Distutils: do everything your setup script needs to do, in a highly flexible and user-driven way. Briefly: create a Distribution instance; find and parse config files; parse the command line; run each Distutils command found there, customized by the options supplied to 'setup()' (as keyword arguments), in config files, and on the command line. The Distribution instance might be an instance of a class supplied via the 'distclass' keyword argument to 'setup'; if no such class is supplied, then the Distribution class (in dist.py) is instantiated. All other arguments to 'setup' (except for 'cmdclass') are used to set attributes of the Distribution instance. The 'cmdclass' argument, if supplied, is a dictionary mapping command names to command classes. 
Each command encountered on the command line will be turned into a command class, which is in turn instantiated; any class found in 'cmdclass' is used in place of the default, which is (for command 'foo_bar') class 'foo_bar' in module 'distutils.command.foo_bar'. The command class must provide a 'user_options' attribute which is a list of option specifiers for 'distutils.fancy_getopt'. Any command-line options between the current and the next command are used to set attributes of the current command object. When the entire command-line has been successfully parsed, calls the 'run()' method on each command object in turn. This method will be driven entirely by the Distribution object (which each command object has a reference to, thanks to its constructor), and the command-specific options that became attributes of each command object. """ global _setup_stop_after, _setup_distribution # Determine the distribution class -- either caller-supplied or # our Distribution (see below). klass = attrs.get('distclass') if klass: del attrs['distclass'] else: klass = Distribution if 'script_name' not in attrs: attrs['script_name'] = os.path.basename(sys.argv[0]) if 'script_args' not in attrs: attrs['script_args'] = sys.argv[1:] # Create the Distribution instance, using the remaining arguments # (ie. everything except distclass) to initialize it try: _setup_distribution = dist = klass(attrs) except DistutilsSetupError as msg: if 'name' not in attrs: raise SystemExit("error in setup command: %s" % msg) else: raise SystemExit("error in %s setup command: %s" % \ (attrs['name'], msg)) if _setup_stop_after == "init": return dist # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. dist.parse_config_files() if DEBUG: print("options (after parsing config files):") dist.dump_option_dicts() if _setup_stop_after == "config": return dist # Parse the command line and override config files; any # command-line errors are the end user's fault, so turn them into # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) if DEBUG: print("options (after parsing command line):") dist.dump_option_dicts() if _setup_stop_after == "commandline": return dist # And finally, run all the commands found on the command line. 
if ok: try: dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") except OSError as exc: if DEBUG: sys.stderr.write("error: %s\n" % (exc,)) raise else: raise SystemExit("error: %s" % (exc,)) except (DistutilsError, CCompilerError) as msg: if DEBUG: raise else: > raise SystemExit("error: " + str(msg)) E SystemExit: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx_97m4xm.cpp -o /tmp/tmpraat08vq/tmp/tmpx_97m4xm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 /usr/lib64/python3.10/distutils/core.py:163: SystemExit During handling of the above exception, another exception occurred: self = def __call__(self): if "unittest.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") if "unittest.python3.skip" in self.module_code: return self.test_env.skipTest("Marked as skippable") # resolve import locally to where the tests are located sys.path.insert(0, self.test_env.path) > self.test_env.run_test_case(self.module_code, self.module_name, self.runas, module_dir=self.module_dir, **self.specs) pythran/tests/__init__.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ pythran/tests/__init__.py:263: in run_test_case cxx_compiled = compile_pythrancode( pythran/toolchain.py:418: in compile_pythrancode output_file = compile_cxxcode(module_name, pythran/toolchain.py:355: in compile_cxxcode output_binary = compile_cxxfile(module_name, fdpath, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ module_name = '_rbfinterp_norun00', cxxfile = '/tmp/tmpx_97m4xm.cpp' output_binary = None kwargs = {'extra_compile_args': ['-O1', '-Wall', '-w', '-UNDEBUG', '-Wno-unused-function', '-Wno-int-in-bool-context', ...]} builddir = '/tmp/tmppsamfwh0', buildtmp = '/tmp/tmpraat08vq' extension = def 
compile_cxxfile(module_name, cxxfile, output_binary=None, **kwargs): '''c++ file -> native module Return the filename of the produced shared library Raises CompileError on failure ''' builddir = mkdtemp() buildtmp = mkdtemp() extension = PythranExtension(module_name, [cxxfile], **kwargs) try: setup(name=module_name, ext_modules=[extension], cmdclass={"build_ext": PythranBuildExt}, # fake CLI call script_name='setup.py', script_args=['--verbose' if logger.isEnabledFor(logging.INFO) else '--quiet', 'build_ext', '--build-lib', builddir, '--build-temp', buildtmp] ) except SystemExit as e: > raise CompileError(str(e)) E distutils.errors.CompileError: error: Command "gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC -DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c /tmp/tmpx_97m4xm.cpp -o /tmp/tmpraat08vq/tmp/tmpx_97m4xm.o -std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas" failed with exit status 1 pythran/toolchain.py:313: CompileError ----------------------------- Captured stdout call ----------------------------- running build_ext new_compiler returns building '_rbfinterp_norun00' extension C compiler: gcc -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fstack-protector-strong -m64 -mcpu=power8 
-mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -D_GNU_SOURCE -fPIC -fwrapv -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mcpu=power8 -mtune=power8 -fasynchronous-unwind-tables -fstack-clash-protection -fPIC
creating /tmp/tmpraat08vq/tmp
compile options: '-DENABLE_PYTHON_MODULE -D__PYTHRAN__=3 -DPYTHRAN_BLAS_OPENBLAS -I/usr/include/flexiblas -I/builddir/build/BUILD/pythran-feature-0.11.0/pythran -I/usr/lib64/python3.10/site-packages/numpy/core/include -I/usr/include/python3.10 -c'
extra options: '-std=c++11 -fno-math-errno -fvisibility=hidden -fno-wrapv -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -O1 -Wall -w -UNDEBUG -Wno-unused-function -Wno-int-in-bool-context -Wno-unknown-warning-option -Wno-unused-local-typedefs -Wno-absolute-value -Wno-missing-braces -Wno-unknown-pragmas'
gcc: /tmp/tmpx_97m4xm.cpp
----------------------------- Captured stderr call -----------------------------
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO: During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
cc1plus: warning: command-line option ‘-Wno-absolute-value’ is valid for C/ObjC but not for C++
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conj.hpp:5,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/conj.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/numpy/linalg/norm.hpp:5,
                 from /tmp/tmpx_97m4xm.cpp:54:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp: In function ‘xsimd::batch, A> {anonymous}::pythonic::numpy::wrapper::conjugate(const xsimd::batch, A>&)’:
/builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/numpy/conjugate.hpp:25:21: error: ‘conj’ is not a member of ‘xsimd’; did you mean ‘std::conj’?
   25 |     return xsimd::conj(v);
      |                    ^~~~
In file included from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/traits.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/include/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/types/combined.hpp:4,
                 from /builddir/build/BUILD/pythran-feature-0.11.0/pythran/pythonic/core.hpp:32,
                 from /tmp/tmpx_97m4xm.cpp:1:
/usr/include/c++/11/complex:1975:5: note: ‘std::conj’ declared here
 1975 |     conj(_Tp __x)
      |     ^~~~
At global scope:
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
cc1plus: note: unrecognized command-line option ‘-Wno-unknown-warning-option’ may have been intended to silence earlier diagnostics
WARNING: Compilation error, trying hard to find its origin...
WARNING: Nop, I'm going to flood you with C++ errors!
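This second failure bottoms out in the same diagnostic, and both tracebacks show the same propagation chain: gcc exits non-zero, numpy.distutils' UnixCCompiler__compile raises CompileError, distutils.core.setup() converts it into SystemExit("error: ..."), and pythran's compile_cxxfile catches the SystemExit and re-raises CompileError for the caller. A minimal sketch of that last translation step, with a hypothetical failing_build() standing in for the distutils setup() call:

# Sketch of the error translation visible in pythran/toolchain.py above:
# distutils reports build failures by raising SystemExit("error: ..."),
# and the caller converts that back into a CompileError so library users
# get an exception instead of an interpreter exit. failing_build() is a
# hypothetical stand-in for setup(..., script_args=['build_ext', ...]).
from distutils.errors import CompileError

def failing_build():
    raise SystemExit('error: Command "gcc ..." failed with exit status 1')

def compile_cxxfile_like():
    try:
        failing_build()
    except SystemExit as e:
        raise CompileError(str(e))

try:
    compile_cxxfile_like()
except CompileError as e:
    print("caught:", e)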
------------------------------ Captured log call -------------------------------
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
INFO pythran:constant_folding.py:93 During constant folding, bailing out due to: module 'builtins' has no attribute 'pythran'
WARNING pythran:toolchain.py:423 Compilation error, trying hard to find its origin...
WARNING pythran:toolchain.py:426 Nop, I'm going to flood you with C++ errors!
=============================== warnings summary ===============================
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8
  /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:8: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import ccompiler
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
../../../../usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17
  /usr/lib64/python3.10/site-packages/numpy/distutils/ccompiler.py:17: DeprecationWarning: The distutils.sysconfig module is deprecated, use sysconfig instead
    from distutils.sysconfig import customize_compiler
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4530: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    if not hasattr(numpy, method):
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4530: DeprecationWarning: `np.complex` is a deprecated alias for the builtin `complex`. To silence this warning, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    if not hasattr(numpy, method):
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
pythran/tables.py:4530
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4530: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    if not hasattr(numpy, method):
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4563: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    obj = getattr(themodule, elem)
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4563: DeprecationWarning: `np.complex` is a deprecated alias for the builtin `complex`. To silence this warning, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    obj = getattr(themodule, elem)
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
pythran/tables.py:4563
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tables.py:4563: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    obj = getattr(themodule, elem)
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
pythran/tests/__init__.py:72
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/__init__.py:72: PytestUnknownMarkWarning: Unknown pytest.mark.module - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    module = pytest.mark.module
:1
:1
:1
:1
:1
:1
:1
:1
  :1: RuntimeWarning: invalid value encountered in arccosh
pythran/tests/test_advanced.py: 9 warnings
pythran/tests/test_base.py: 8 warnings
pythran/tests/test_cases.py: 106 warnings
pythran/tests/test_euler.py: 10 warnings
pythran/tests/test_gwebb.py: 6 warnings
pythran/tests/test_numpy_broadcasting.py: 1 warning
pythran/tests/test_numpy_fft.py: 10 warnings
pythran/tests/test_numpy_func2.py: 16 warnings
pythran/tests/test_ndarray.py: 22 warnings
pythran/tests/test_none.py: 147 warnings
pythran/tests/test_numpy_func0.py: 16 warnings
pythran/tests/test_numpy_func3.py: 43 warnings
pythran/tests/test_numpy_random.py: 4 warnings
pythran/tests/test_optimizations.py: 4 warnings
pythran/tests/test_rosetta.py: 36 warnings
pythran/tests/test_slice.py: 26 warnings
pythran/tests/test_scipy.py: 15 warnings
pythran/tests/test_str.py: 3 warnings
pythran/tests/test_typing.py: 3 warnings
  /usr/lib64/python3.10/ast.py:410: DeprecationWarning: visit_NameConstant is deprecated; add visit_Constant
    return visitor(node)
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :59: RuntimeWarning: overflow encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :60: RuntimeWarning: overflow encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :63: RuntimeWarning: overflow encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :15: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :17: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :19: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :63: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :64: RuntimeWarning: overflow encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :21: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :23: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :20: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_d2q9_nxnyns_run0
  :22: RuntimeWarning: invalid value encountered in double_scalars
pythran/tests/test_cases.py::TestCases::test_log_likelihood_run0
  :10: RuntimeWarning: divide by zero encountered in log
pythran/tests/test_complex.py::TestComplex::test_complex256_array0
  :1: RuntimeWarning: overflow encountered in cos
pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_nanmax2
pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_nanmin2
  :1: RuntimeWarning: All-NaN slice encountered
pythran/tests/test_numpy_func1.py::TestNumpyFunc1::test_alen0
pythran/tests/test_numpy_func1.py::TestNumpyFunc1::test_alen1
  <__array_function__ internals>:5: DeprecationWarning: `np.alen` is deprecated, use `len` instead
pythran/tests/test_ndarray.py::TestNdarray::test_vexpr_of_texpr
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/test_ndarray.py:1066: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    vexpr_of_texpr=[NDArray[numpy.float32,:,:], NDArray[numpy.bool,:,:]])
pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_asscalar0
pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_asscalar1
  :1: DeprecationWarning: np.asscalar(a) is deprecated since NumPy v1.16, use a.item() instead
pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_9
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/test_numpy_func2.py:245: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    dtype = numpy.float
pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_9
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/test_numpy_func2.py:177: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    dtype = numpy.float
pythran/tests/test_ndarray.py::TestNdarray::test_float128_0
  :1: RuntimeWarning: overflow encountered in longdouble_scalars
pythran/tests/test_numpy_func3.py::TestNumpyFunc3::test_numpy_pow3
  :1: RuntimeWarning: divide by zero encountered in power
pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_arccosh_scalar_float
  :4: RuntimeWarning: invalid value encountered in arccosh
pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_arctan_complex
pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_arctan_matrix_complex
  :4: RuntimeWarning: divide by zero encountered in arctan
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_bytes1
  :6: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs.
  Use frombuffer instead
pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_arctanh_float
pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_arctanh_matrix_float
  :4: RuntimeWarning: divide by zero encountered in arctanh
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers0
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers0
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers0
  :5: DeprecationWarning: This function is deprecated. Please call randint(1, 9 + 1) instead
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers1
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers1
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers1
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers2
pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_random_integers3
  :5: DeprecationWarning: This function is deprecated. Please call randint(10, 20 + 1) instead
pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_fromstring0
pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_fromstring1
  :1: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs.
  Use frombuffer instead
pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_fromstring1
  :1: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs.
  Use frombuffer instead
pythran/tests/test_random.py::TestRandom::test_shuffle2
pythran/tests/test_random.py::TestRandom::test_shuffle3
  :5: DeprecationWarning: The *random* parameter to shuffle() has been deprecated since Python 3.9 and will be removed in a subsequent version.
pythran/tests/test_spec_parser.py::TestSpecParser::test_middle_spec1
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/test_spec_parser.py:127: DeprecationWarning: Please use assertEqual instead.
    self.assertEquals(len(pythran.spec_parser(code).functions), 1)
pythran/tests/test_spec_parser.py::TestSpecParser::test_middle_spec1
  /builddir/build/BUILD/pythran-feature-0.11.0/pythran/tests/test_spec_parser.py:128: DeprecationWarning: Please use assertEqual instead.
    self.assertEquals(len(pythran.spec_parser(code).functions['zoo']), 2)
-- Docs: https://docs.pytest.org/en/stable/warnings.html
=========================== short test summary info ============================
FAILED pythran/tests/test_base.py::TestBase::test_complex_conj - distutils.er...
FAILED pythran/tests/test_cases.py::TestCases::test_cdotc_run0 - distutils.er...
FAILED pythran/tests/test_cases.py::TestCases::test_crotg_run0 - distutils.er...
FAILED pythran/tests/test_cases.py::TestCases::test_cronbach_run0 - distutils...
FAILED pythran/tests/test_cases.py::TestCases::test_matrix_class_distance_run0
FAILED pythran/tests/test_cases.py::TestCases::test_rand_mat_stat_norun0 - ...
FAILED pythran/tests/test_complex.py::TestComplex::test_complex256_array4 - d...
FAILED pythran/tests/test_complex.py::TestComplex::test_conjugate - distutils...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var3 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var4 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var5 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var6 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var7 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var8 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var9 - distuti...
FAILED pythran/tests/test_numpy_func1.py::TestNumpyFunc1::test_transpose_expr2
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_std0 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_std1 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_std2 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_std3 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var0 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var1 - distuti...
FAILED pythran/tests/test_numpy_func0.py::TestNumpyFunc0::test_var2 - distuti...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_1 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_10 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_11 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_2 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_3 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_4 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_5 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_6 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_7 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_8 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_convolve_9 - d...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_1 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_10
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_11
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_2 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_3 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_4 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_5 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_6 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_7 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_8 - ...
FAILED pythran/tests/test_numpy_func2.py::TestNumpyFunc2::test_correlate_9 - ...
FAILED pythran/tests/test_normalize_methods.py::TestNormalizeMethods::test_dispatch_conjugate
FAILED pythran/tests/test_numpy_func3.py::TestNumpyFunc3::test_vdot0 - distut...
FAILED pythran/tests/test_numpy_func3.py::TestNumpyFunc3::test_vdot1 - distut...
FAILED pythran/tests/test_numpy_func3.py::TestNumpyFunc3::test_vdot2 - distut...
FAILED pythran/tests/test_numpy_func3.py::TestNumpyFunc3::test_vdot3 - distut...
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm0
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm1
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm2
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm3
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm4
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm5
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm6
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm7
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm8
FAILED pythran/tests/test_numpy_linalg.py::TestNumpyLinalg::test_linalg_norm_pydoc
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_binomial0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_binomial1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_binomial2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_chisquare0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_chisquare0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_chisquare2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_exponential0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_exponential0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_exponential0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_laplace1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_exponential1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_exponential2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_laplace2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logistic0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_f0a - ...
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_rayleigh0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_f0b - ...
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logistic0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_rayleigh0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_f2 - d...
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logistic0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_rayleigh0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gamma0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logistic1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_rayleigh1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logistic2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gamma0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_rayleigh2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gamma2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_lognormal0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_geometric0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_geometric0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_lognormal0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_geometric2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_exponential0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_lognormal0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_exponential1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gumbel0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_exponential2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_lognormal1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_gamma0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gumbel0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_lognormal2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_gamma1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gumbel0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_gamma2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logseries0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gumbel1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_normal0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logseries1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_gumbel2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_normal1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_standard_normal2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_logseries2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_laplace0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_uniform_no_arg
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_normal0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_laplace0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_uniform_size_int
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_normal0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_laplace0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_uniform_size_tuple
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_normal0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_normal1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_weibull0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_normal2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_weibull0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_pareto0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_weibull2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_pareto0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_pareto2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_poisson0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_poisson0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_poisson0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_poisson1
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_poisson2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_power0a
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_power0b
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_power2
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_randn0
FAILED pythran/tests/test_numpy_random.py::TestNumpyRandom::test_numpy_randn1
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_complex
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_matrix_complex
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_matrix_complex
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_matrix_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_matrix_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_scalar_complex
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_scalar_complex
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_scalar_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conj_scalar_float
FAILED pythran/tests/test_numpy_ufunc_unary.py::TestNumpyUFuncUnary::test_numpy_ufunc_unary_numpy_ufunc_unary_numpy_conjugate_complex
FAILED pythran/tests/test_scipy.py::TestScipy::test__calc_binned_statistic_norun0
FAILED pythran/tests/test_scipy.py::TestScipy::test__rbfinterp_norun0 - distu...
=== 155 failed, 3275 passed, 49 skipped, 611 warnings in 3884.50s (1:04:44) ====
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.SWL4HE (%check)
    Macro expanded in comment on line 23: %{url}/archive/%{version}/%{name}-%{version}.tar.gz
    Bad exit status from /var/tmp/rpm-tmp.SWL4HE (%check)
Child return code was: 1
EXCEPTION: [Error()]
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/mockbuild/trace_decorator.py", line 93, in trace
    result = func(*args, **kw)
  File "/usr/lib/python3.9/site-packages/mockbuild/util.py", line 600, in do_with_status
    raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
mockbuild.exception.Error: Command failed:
 # bash --login -c /usr/bin/rpmbuild -ba --noprep --target ppc64le --nodeps /builddir/build/SPECS/pythran.spec
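Since the 155 failures share that single compile error, a candidate fix can be checked without repeating the hour-long %check. A quick triage sketch, assuming it is run with /usr/bin/python3 from the unpacked source tree inside the chroot; the two test IDs are copied from the short test summary above:

    import sys
    import pytest

    # Re-run only two of the conjugate-related failures; the exit code stays
    # non-zero until the xsimd::conj compile error is resolved.
    sys.exit(pytest.main([
        "-v",
        "pythran/tests/test_base.py::TestBase::test_complex_conj",
        "pythran/tests/test_complex.py::TestComplex::test_conjugate",
    ]))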