Switch to github actions #408
Conversation
Codecov Report
@@ Coverage Diff @@
## master #408 +/- ##
==========================================
- Coverage 82.40% 77.10% -5.30%
==========================================
Files 31 31
Lines 2824 2835 +11
==========================================
- Hits 2327 2186 -141
- Misses 497 649 +152
Continue to review full report at Codecov.
println("Test Doctests") | ||
_t0 = time() | ||
doctest(ControlSystems) | ||
println("Ran Doctests in $(round(time()-_t0, digits=2)) seconds") |
It seems that not having doctest run together with the other tests, so that it is included when checking coverage, reduces coverage quite a bit.
Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for testing coverage, since I believe doctests are designed to ensure the documentation examples don't go stale rather than to verify that the code is functionally correct.
I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions lets us see the failure and stop a merge when doctest fails, I have no problem removing them from the tests, provided we also add tests to cover the same functionality.
Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.
> Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for testing coverage, since I believe doctests are designed to ensure the documentation examples don't go stale rather than to verify that the code is functionally correct.

In the coverage report you can see exactly which lines lost coverage. Most of it seems to be from pid_design.jl and discrete.jl.
> I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions lets us see the failure and stop a merge when doctest fails, I have no problem removing them from the tests, provided we also add tests to cover the same functionality.
We can have a step in the workflow that runs doctest and runs make.jl for the docs, and I believe we can have it be required (see the sketch after this comment).
> Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.
I'd never run doctest locally if it wasn't part of the tests, though I'm not sure I'd want to most of the time. It feels like the tests are what I'd use to see whether I'm close to something working, and the PR run can catch whether something broke in the docs; a little less time running locally for me 😄
But if you feel it's warranted to run doctest all the time, that is also fine by me; you have more experience with the workflows for this package.
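For concreteness, a minimal sketch of what such a docs workflow could look like in its own workflow file. The file name, step names, and docs/ project layout here are assumptions, not necessarily what this PR does; making the check required is a branch-protection setting rather than anything in the YAML itself:

# .github/workflows/docs.yml (hypothetical)
name: Docs
on: [push, pull_request]
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
        with:
          version: '1'
      - name: Install docs dependencies
        run: julia --project=docs -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
      - name: Build docs (make.jl runs the doctests internally)
        run: julia --project=docs docs/make.jl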
The integration of GitHub Actions is (not surprisingly) much better than it ever was for Travis, and a commit or PR will display the red cross instead of the green checkmark whenever any of the workflows failed. It should thus be fairly easy to detect the docs build failing even if it's not part of the tests. There is some noise left due to coverage sometimes reporting a failure for no good reason, but the docs failure remains easily visible on hovering over the red cross.
We had some weird behaviour where makedocs in docs/make.jl failed the doctests (it runs doctest internally, and it printed the errors we had due to the version problems) but still managed to do the build, and the workflow counted as passed. This was unexpected to me and is not documented in Documenter.jl as far as I can see, so for now I run doctest by itself first and then run docs/make.jl, where makedocs runs doctest a second time (this could be disabled for makedocs).
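In workflow terms, that arrangement is roughly the following two steps (the step names are mine; doctest(ControlSystems) matches the call used in the tests above, and doctest = false is the documented makedocs keyword for skipping its internal doctest pass):

      - name: Run doctests explicitly   # fails the workflow if any doctest breaks
        run: julia --project=docs -e 'using Documenter, ControlSystems; doctest(ControlSystems)'
      - name: Build docs                # make.jl could call makedocs(...; doctest = false) to skip the second pass
        run: julia --project=docs docs/make.jl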
.github/workflows/ci.yml (outdated)
steps:
  - uses: actions/checkout@v2
  - name: Install apt deps
    run: sudo apt-get install libgtk-3-dev dvipng texlive
Are all these needed for both test and doctest?
I guess they are related to plotting, which I think is run in both.
I tried removing them from the test job, and it seems to have worked. So I'll leave them out for now at least, though doctest will still have them.
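So the apt step would remain only in the doctest/docs job, something like this (the job layout is assumed; the package list is taken from the diff above):

  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install apt deps   # plotting/LaTeX packages, only needed where docs are built
        run: sudo apt-get install libgtk-3-dev dvipng texlive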
.github/workflows/ci.yml (outdated)
- uses: codecov/codecov-action@v1
  with:
    file: lcov.info
Do we want to run coverage for both 1.0 and 1? What does that even mean? Does it average over the two? Does it pick the one that reported first/last?
IMO coverage is probably only needed on one version, unless a lot of version-specific code starts to be introduced (which I don't foresee being an issue in this package).
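One way to restrict the upload, assuming the job matrix has a version key (a sketch, not necessarily what the PR ends up with):

      - uses: codecov/codecov-action@v1
        if: matrix.version == '1'   # upload coverage from a single Julia version only
        with:
          file: lcov.info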
I think it would be nicer if the different CI parts (e.g. code tests vs. documentation generation) were split out into separate workflow files.
@albheim how's it going with this PR? It's still marked as WIP, is it getting close to merge? I have a huge backlog of commits to submit but right now there's no CI to show me how many bugs I've introduced :P
Ohh, yeah, it is not really working as intended yet. I have not had time over the weekend; I might have some tomorrow, or you are very welcome to take a look. It seems like tests are working for 1 and 1.0 for now, but nightly runs for 6 hours before stopping; I have not looked at that yet, but we probably want to stop it earlier. Docs also have some problems in that they can fail doctest but still run without showing it, so I was talking to Mattias about moving doctest into the normal tests (as it has been before). I'll push up the latest I had sitting around, and if you want you can take a look; otherwise I'll try to take some time tomorrow.
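On the nightly job running for six hours: GitHub Actions jobs time out after 360 minutes by default, and timeout-minutes can cap that; continue-on-error can additionally keep nightly failures from blocking the PR. A sketch, assuming a matrix with a version key:

  test:
    runs-on: ubuntu-latest
    timeout-minutes: 60                                      # stop hung jobs early (default is 360)
    continue-on-error: ${{ matrix.version == 'nightly' }}    # nightly may fail without failing CI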
Agreed, I'm all for not spending too much time on keeping up support for old versions. One thing here is that StaticArrays is only used in testing/doctesting, so it seems like not too big a deal to hold it back based solely on ControlSystems; but if you use more packages than ControlSystems, there might be other dependencies on StaticArrays which would then also be held back, which is not optimal.
I also think the better option would be to drop support for the older versions, especially given the time it takes to maintain them, even if it feels quite unsatisfactory. I guess there's not much to do about it. We have some conditionals in the code depending on the version; if they make things work better in old versions, I think we could keep them around for a bit longer, but if everything is completely broken anyway, we could get rid of them.
This is the key: if we limit the compat, a user installing ControlSystems will not be able to use the new versions of StaticArrays. A pragmatic option would be to simply write other tests that do not use StaticArrays, but it's nice to have the tests as they are, since using StaticArrays is the go-to solution for increasing performance in a large number of cases.
Btw, when I moved the Travis setup to GitHub Actions I dropped the xvfb dependency, since that seemed more annoying to set up, and practically implemented what is suggested in #401. If we would like to save that for another time I can remove it from this PR; otherwise we should remember to close that issue when merging this.
After discussing with @mfalt I'm now running the doctests in the same workflow as building the docs (nicer anyway, imo), so they no longer run with the tests. This allows us not to set a compat bound in the main Project.toml, since we have the separate Project.toml for the docs. So I have locked StaticArrays to 0.12 in the docs Project.toml, which fixes the problem with doctest, while in the main Project.toml no compat is set. What I also noticed is that we only seem to use it at all in doctest, so maybe just remove it completely from the main Project.toml?
With this said, I still think we can raise the supported version, but maybe leave that for #356 so we can get this PR merged?
@@ -1,9 +1,9 @@
 # Set plot globals
 ENV["PLOTS_TEST"] = "true"
-ENV["GKSwstype"] = "100"
+ENV["GKSwstype"] = "nul"
As suggested in #401, to remove the dependency on xvfb. I did it here so I would not have to figure out how to get xvfb working correctly in GitHub Actions just to then have to undo it. If we encounter problems with generating images for the plots, we can patch it later.
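If one preferred, the same globals could be set once at the workflow level instead of in the test script (a sketch; this PR sets them in the test script as shown in the diff above):

env:
  PLOTS_TEST: "true"
  GKSwstype: "nul"   # headless GR output device, so xvfb is not needed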
Checks seem to be passing, time to pull?
I'm okay with the current state of it.
Alright, I'll pull and will deal with any consequences tomorrow 😊
I was interested in testing out GitHub Actions, so I just tried to replace Travis with the GitHub Actions CI file suggested on Discourse, linked in the top comment of #407.