
[JUJU-1172] More cpu and memory on the ephemeral node for pylibjuju tests #24

Merged 1 commit into juju:main on Jun 9, 2022

Conversation

@cderici commented on Jun 8, 2022

Python-libjuju integration tests take more than 2 hours on ephemeral-github-medium-amd64 (t3a.medium), so they fail with a timeout; we're moving to ephemeral-github-8c-32g-amd64 (m5a.2xlarge).

Side note: technically m5a.xlarge also works (it doesn't time out), but it takes too long to wait for tests on every PR. The ideal is m5a.4xlarge, which I believe is what was being used before the latest changes, but m5a.2xlarge is fine too (half the price 💰).

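The change boils down to swapping the node label in a Jenkins Job Builder definition. A minimal sketch of what that edit could look like; the job name and surrounding fields here are illustrative, not the actual juju-qa-jenkins config:

```yaml
# Illustrative Jenkins Job Builder excerpt (hypothetical job name and
# structure); only the `node` label is the change this PR makes.
- job:
    name: github-integration-pylibjuju
    # Before: t3a.medium (2 vCPU, 4 GB) -- tests exceed the global timeout
    # node: ephemeral-github-medium-amd64
    # After: m5a.2xlarge (8 vCPU, 32 GB)
    node: ephemeral-github-8c-32g-amd64
    wrappers:
      - timeout:
          timeout: 120   # the 120-minute global timeout the tests were hitting
          fail: true
```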
@juanmanuel-tirado (Contributor) left a comment


LG2M

@cderici cderici merged commit ca28b57 into juju:main Jun 9, 2022
jujubot added a commit to juju/python-libjuju that referenced this pull request Jun 9, 2022
#681

#### Description

Looks like a unit test slipped through while we were landing #679. This PR fixes that, along with other CI problems in pylibjuju.

#### QA Steps

CI tests should pass.

#### Notes & Discussion

This PR will act as a set of changes required to fix up the libjuju CI. There seem to be some configuration issues in juju-qa-jenkins too.

~~This shouldn't land until we see all green in the CI.~~ We have some intermittent failures about internal event handling that will be addressed later.


- [x] The timeout we see in the integration tests is caused by the ephemeral node configuration on AWS: the jobs spawn machines that are not powerful enough to run the tests within the 120-minute global timeout. juju/juju-qa-jenkins#24 will fix that.

- [x] The `check-merge` job will be fixed when the [detect-merge-go-version](https://github.com/juju/juju-qa-jenkins/blob/0e87aff16e6e83908926292709b88cf6a153dd5f/jobs/github/github-check-merge.yml#L143) builder is excluded from the python-libjuju jobs.
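Excluding a builder for one set of jobs in Jenkins Job Builder typically means leaving it out of the job template's builder list. A hedged sketch of the idea; the template and builder names other than `detect-merge-go-version` are hypothetical, not the real contents of `jobs/github/github-check-merge.yml`:

```yaml
# Illustrative only: a check-merge job template that skips the Go-specific
# builder for python-libjuju. Names besides detect-merge-go-version are
# hypothetical placeholders.
- job-template:
    name: 'github-check-merge-{project}'
    builders:
      - detect-merge-pr
      # detect-merge-go-version is Go-specific; it is excluded for the
      # python-libjuju jobs, which have no Go toolchain requirement.
      # - detect-merge-go-version
      - run-merge-checks
```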
@cderici cderici deleted the more-power-for-libjuju-tests branch August 24, 2022 17:12