
Add integration testing with python from Debian, Ubuntu, Fedora, Gentoo, Arch Linux, OpenSUSE, Slackware #2923

Merged · 3 commits merged into pypa:main · Feb 4, 2022

Conversation

@mkoeppe (Contributor) commented Dec 11, 2021

Summary of changes

Closes

Pull Request Checklist

@jaraco (Member) commented Jan 30, 2022

It looks like some tests are unsuccessful. Are these tests going to be flaky? Will you be available to help maintain these checks if included?

@mkoeppe (Contributor, Author) commented Jan 31, 2022

The failure in opensuse-tumbleweed (https://github.com/pypa/setuptools/runs/4494440976?check_suite_focus=true) looks like a legitimate incompatibility of setuptools with their Python to me.

I'll revise the list of platforms, adding some newer ones and removing flaky ones.

I maintain these things in the Sage project. Occasionally the list of platforms needs to be updated, and I'll be happy to send a PR, but it should otherwise not require much maintenance.

@jaraco merged commit f1732ac into pypa:main on Feb 4, 2022
@abravalheri (Contributor) commented
Hi @mkoeppe, thank you very much for working on improving the tests. It is an extremely important (and difficult) thing to do!

I noticed that the newly introduced workflow has recently been failing with some strange error messages:

The short ebuild name "tox" is ambiguous. Please specify

/sage/build/bin/sage-bootstrap-python: error: none of python python3 python3.10 python3.9 python3.8 python3.7 python2.7 python3.6 python2 is a suitable Python

Bootstrap failed. Either upgrade autotools and gettext; or run bootstrap with the -d option to download the auto-generated files instead.

The intriguing part is that this happens for a PR that has no code changes; it only modifies the pytest configuration: #3080 (i.e., it expands the set of warnings being ignored, so it is not a big configuration change either...)

Is there any chance you have seen these error messages before, or have any information about these failures? I am not very familiar with the Sage tools and the CI steps you have set up, so I am kind of lost 😅 about what can be done to fix these problems. Any help would be much appreciated.

@mkoeppe (Contributor, Author) commented Feb 4, 2022

The Sage portability CI provisions installations of system packages based on a database that maps Sage packages to their names in the various distributions. The failure for gentoo-python3.9-standard comes from a flaw in this mapping, for which we already have a fix in https://trac.sagemath.org/ticket/33269; once that is merged into our develop branch, this GH Actions workflow will no longer fail.
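For readers not familiar with that setup, here is a minimal sketch of the idea; the entries below are purely illustrative and do not reproduce the actual Sage database format. Each Sage package is mapped to the name it carries in each distribution, and the CI uses that mapping to tell each system's package manager what to install:

```yaml
# Illustrative sketch only -- not the real Sage database layout or contents.
# One Sage package, mapped to the name each distribution uses for it:
tox:
  debian: tox
  fedora: python3-tox
  archlinux: python-tox
  gentoo: dev-python/tox   # an unqualified "tox" is ambiguous on Gentoo,
                           # matching the "short ebuild name" error above
```

A missing or under-qualified entry in such a mapping is the kind of flaw that produces the ambiguity error quoted above.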

It appears that centos-8 can no longer be tested (it is past EOL, and it looks from the log like its packages are no longer available). This distribution name can simply be removed from ci-sage.yml; I can send a PR if nobody beats me to it.

As replacements we have centos-stream-8 and centos-stream-9 (https://trac.sagemath.org/ticket/33196), which can be added as soon as that ticket is merged into our develop branch.
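As a rough sketch of what that edit could look like, assuming ci-sage.yml lists the distributions in a job matrix (the key names here are assumptions, not the actual file contents):

```yaml
# Sketch only; check ci-sage.yml for the real key names and the full list.
strategy:
  fail-fast: false
  matrix:
    tox_system_factor:
      # - centos-8        # past EOL; system packages no longer available
      - centos-stream-8   # available once trac #33196 reaches develop
      - centos-stream-9
      - fedora-35         # ...other existing entries stay unchanged
```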

@mkoeppe (Contributor, Author) commented Feb 4, 2022

OK, I have opened #3081 for the update

@mkoeppe deleted the ci_sage branch on February 4, 2022 at 19:58
@abravalheri (Contributor) commented
Thank you very much @mkoeppe for having a look at this and providing a PR!

@di (Member) commented Feb 11, 2022

@jaraco I think the large # of jobs this introduces to setuptools' GitHub workflows is slowing down builds for other PyPA projects. There is a limit of 20 concurrent jobs org-wide for our plan, and setuptools has gone from 36 jobs in #3078 to 119 jobs here. I've noticed a significant backlog over at https://github.com/pypa/warehouse/actions over the last few days and suspect this might be why.

Any thoughts on how we could get this down to a more reasonable number of concurrent jobs per workflow run?

(cc @ewdurbin)
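One option that may help, if the workflow uses a job matrix, is GitHub Actions' `strategy.max-parallel`, which caps how many matrix jobs run at the same time within a single workflow run. A minimal sketch follows; the job name and matrix entries are assumptions, not the actual ci-sage.yml contents:

```yaml
# Sketch: cap how many matrix jobs of this workflow run concurrently,
# so one run cannot saturate the org-wide 20-job limit. Names are illustrative.
jobs:
  docker:
    strategy:
      fail-fast: false
      max-parallel: 10   # at most 10 of the matrix jobs run at once
      matrix:
        tox_system_factor: [ubuntu-focal, debian-bullseye, fedora-35]
```

Note that this only limits concurrency, not the total number of jobs, so each workflow run would take longer overall but leave runners free for other repositories in the org.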
