
Switch to github actions #408

Merged
merged 47 commits into from
Jan 16, 2021

Conversation

albheim
Member

@albheim albheim commented Dec 7, 2020

I was interested in testing out GitHub Actions, so I just tried to replace Travis with the GitHub Actions CI file suggested on Discourse, linked in the top comment of #407.

@JuliaControlBot

This is an automated message.
Plots were compared to references. No changes were detected.

@codecov

codecov bot commented Dec 7, 2020

Codecov Report

Merging #408 (58d0c64) into master (6c54a25) will decrease coverage by 5.29%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #408      +/-   ##
==========================================
- Coverage   82.40%   77.10%   -5.30%     
==========================================
  Files          31       31              
  Lines        2824     2835      +11     
==========================================
- Hits         2327     2186     -141     
- Misses        497      649     +152     
Impacted Files Coverage Δ
src/pid_design.jl 32.57% <0.00%> (-45.38%) ⬇️
src/discrete.jl 40.54% <0.00%> (-43.72%) ⬇️
src/types/SisoTf.jl 25.00% <0.00%> (-25.00%) ⬇️
src/types/SisoTfTypes/SisoZpk.jl 60.00% <0.00%> (-21.11%) ⬇️
src/freqresp.jl 88.00% <0.00%> (-3.90%) ⬇️
src/utilities.jl 80.21% <0.00%> (-2.58%) ⬇️
src/types/Lti.jl 54.76% <0.00%> (-2.39%) ⬇️
src/types/SisoTfTypes/SisoRational.jl 72.58% <0.00%> (-1.62%) ⬇️
src/types/StateSpace.jl 89.28% <0.00%> (-1.20%) ⬇️
src/plotting.jl 77.88% <0.00%> (-0.92%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 6c54a25...58d0c64.

Comment on lines -44 to -47
println("Test Doctests")
_t0 = time()
doctest(ControlSystems)
println("Ran Doctests in $(round(time()-_t0, digits=2)) seconds")
Member Author


It seems that not running the doctests together with the other tests, so that they count when checking coverage, reduces coverage quite a bit.

Contributor


Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for the testing coverage, since I believe doctests are more designed to ensure the documentation examples don't go stale rather than the code is functionally correct.

Member

@mfalt mfalt Dec 8, 2020


I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions allows us to visualize and stop a merge when a doctest fails, I have no problem removing them from the tests, if we also add tests to cover the same functionality.

Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

Member Author


Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for the testing coverage, since I believe doctests are more designed to ensure the documentation examples don't go stale rather than the code is functionally correct.

In the coverage report you can see exactly which lines lost coverage. Most of it seems to be from pid_design.jl and discrete.jl

Member Author


I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions allows us to visualize and stop a merge when a doctest fails, I have no problem removing them from the tests, if we also add tests to cover the same functionality.

We can have a step in the workflow that is running doctest and running make.jl for the docs, and we can have it be required I believe.

Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

I'd never run doctest locally if it was not in the tests, though I'm not sure I'd want to most of the time. It feels like the tests are what I'd use to see if I'm close to something working, and running the PR can catch if something broke with the docs; a little less time running locally for me 😄

But if you feel it's warranted to run doctest all the time that is also fine by me, you have more experience with the workflows for this package.
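A required docs workflow of the kind discussed here could be sketched roughly as follows (the workflow name, branch, and action versions are assumptions, not taken from this PR):

```yaml
# Hypothetical docs workflow sketch: building the docs (which runs the
# doctests) becomes its own check, which can then be marked as required
# for merging in the repository's branch-protection settings.
name: Docs
on:
  push:
    branches: [master]
  pull_request:
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
        with:
          version: '1'
      - name: Install docs dependencies
        run: julia --project=docs -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
      - name: Build docs and run doctests
        run: julia --project=docs docs/make.jl
```

A failing doctest then shows up as a red cross on the PR like any other failed check.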

Member


The integration of GitHub Actions is (not surprisingly) much better than it ever was for Travis, and a commit or PR will display the red cross instead of the green checkmark whenever any of the workflows failed. It should thus be fairly easy to detect the docs build failing even if it's not part of the tests. There is some noise left due to coverage sometimes reporting a failure for no good reason, but the docs failure remains easily visible on hovering over the red cross.

Member Author


We had some weird behaviour where makedocs in docs/make.jl failed the doctests (it runs doctest internally, and it printed the errors we had due to the version problems) but still managed to do the build, and the workflow counted as passed. This was unexpected to me and is not documented in Documenter.jl as far as I can see, so for now I run doctest by itself first and then run docs/make.jl, where makedocs runs doctest a second time (this could be disabled for makedocs).
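A minimal sketch of that arrangement in docs/make.jl, assuming Documenter's standard doctest/makedocs API (the sitename is an illustrative value, not taken from this PR):

```julia
# Sketch of docs/make.jl: run the doctests explicitly first, so a failure
# throws and fails the workflow step; then build the docs with makedocs.
using Documenter, ControlSystems

doctest(ControlSystems)  # throws on doctest failure, failing the CI step

makedocs(
    sitename = "ControlSystems.jl",   # illustrative value
    modules = [ControlSystems],
    # doctest = false,  # optional: skip the redundant second doctest run
)
```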

steps:
- uses: actions/checkout@v2
- name: Install apt deps
  run: sudo apt-get install libgtk-3-dev dvipng texlive
Member Author


Are all these needed for both test and doctest?

Member


I guess they are related to plots, which I think are run in both.

Member Author


I tried to remove them from test, and it seems to have worked. So I'll leave them out for now at least, though doctest will still have them.

Comment on lines 46 to 48
- uses: codecov/codecov-action@v1
  with:
    file: lcov.info
Member Author

@albheim albheim Dec 8, 2020


Do we want to run coverage for both 1.0 and 1? What does that even mean? Does it average over the two? Does it pick the one that reported first/last?

Contributor


IMO coverage is probably only needed on one version, unless there starts to be a lot of version-specific code introduced (which I don't foresee being an issue in this package).
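One way to restrict the upload to a single matrix entry, sketched under the assumption that the job matrix has a `version` key with `'1'` among its values:

```yaml
# Hypothetical sketch: process and upload coverage only from the Julia 1 job,
# so the other matrix entries don't report conflicting numbers to Codecov.
- uses: julia-actions/julia-processcoverage@v1
  if: matrix.version == '1'
- uses: codecov/codecov-action@v1
  if: matrix.version == '1'
  with:
    file: lcov.info
```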

@imciner2
Contributor

imciner2 commented Dec 8, 2020

I think it would be nicer if the different CI parts (e.g. code tests vs. documentation generation) were split out into separate workflow files.

@baggepinnen
Member

@albheim how's it going with this PR? It's still marked as WIP, is it getting close to merge? I have a huge backlog of commits to submit but right now there's no CI to show me how many bugs I've introduced :P

@albheim
Member Author

albheim commented Dec 12, 2020

@albheim how's it going with this PR? It's still marked as WIP, is it getting close to merge? I have a huge backlog of commits to submit but right now there's no CI to show me how many bugs I've introduced :P

Ohh, yeah, it's not really working as intended yet. I haven't had time over the weekend. I might have some time tomorrow, or you are very welcome to take a look.

It seems like tests are working for 1 and 1.0 for now, but nightly runs for 6 hrs before stopping; I haven't looked at that yet, but we probably want to stop it earlier. Docs also have a problem in that doctest can fail but the build still runs without showing it, so I was talking to Mattias about moving doctest into the normal tests (as it has been before).

I'll push up the latest I had sitting around and if you want you can take a look, otherwise I'll try to take some time tomorrow.
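Stopping the nightly job earlier could be sketched with a job-level timeout (GitHub's default of 360 minutes matches the 6 hrs observed; all values below are illustrative, not taken from this PR):

```yaml
# Hypothetical sketch: cap runaway jobs and keep nightly from blocking the PR.
jobs:
  test:
    runs-on: ubuntu-latest
    timeout-minutes: 60   # stop hung jobs well below the 360-minute default
    continue-on-error: ${{ matrix.version == 'nightly' }}  # nightly may fail
    strategy:
      matrix:
        version: ['1.0', '1', 'nightly']
```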

@albheim
Member Author

albheim commented Dec 14, 2020

I added this to #356 as yet another instance of time wasted due to trying to support old Julia versions

Agreed, I'm all for not spending too much time on keeping up support for old versions.

One thing here is that StaticArrays is only used in testing/doctesting, so it seems like not too big a deal to keep it held back solely based on ControlSystems, but if you want to use more packages than ControlSystems there might be other dependencies on StaticArrays which would then be held back, which is not optimal.

@olof3
Contributor

olof3 commented Dec 14, 2020

I added this to #356 as yet another instance of time wasted due to trying to support old Julia versions

I also think the better option would be to drop support for the older versions, especially given the time it takes to maintain. Even if it feels quite unsatisfactory, I guess there's not much to do about it.

We have some conditionals in the code depending on the version; if they make things work better in old versions, I think we could keep them around for a bit longer, but if everything is completely broken anyway, we could get rid of them.

@baggepinnen
Member

One thing here is that StaticArrays is only used in testing/doctesting, so it seems like not too big a deal to keep it held back solely based on ControlSystems, but if you want to use more packages than ControlSystems there might be other dependencies on StaticArrays which would then be held back, which is not optimal.

This is the key: if we limit the compat, a user installing ControlSystems will not be able to use the new versions of StaticArrays. A pragmatic option would be to simply write other tests that do not use StaticArrays, but it's nice to have the tests, as using StaticArrays is the go-to solution for increasing performance in a large number of cases.

@albheim
Member Author

albheim commented Dec 15, 2020

Btw, when I moved the Travis setup to GitHub Actions I dropped the xvfb dependency, since that seemed more annoying to set up, and practically implemented what is suggested in #401. If we would like to save that for another time I can remove it from this PR; otherwise we should remember to close that issue when merging this.

@albheim
Member Author

albheim commented Dec 15, 2020

One thing here is that StaticArrays is only used in testing/doctesting, so it seems like not too big a deal to keep it held back solely based on ControlSystems, but if you want to use more packages than ControlSystems there might be other dependencies on StaticArrays which would then be held back, which is not optimal.

This is the key: if we limit the compat, a user installing ControlSystems will not be able to use the new versions of StaticArrays. A pragmatic option would be to simply write other tests that do not use StaticArrays, but it's nice to have the tests, as using StaticArrays is the go-to solution for increasing performance in a large number of cases.

After discussing with @mfalt I'm now running the doctests in the same workflow as building the docs (nicer anyway, imo), so they no longer run with the tests. This allows us to not set a compat in the main Project.toml, since we have a separate Project.toml for the docs. So I have locked StaticArrays to 0.12 in the docs Project.toml, which fixes the problem with doctest, while in the main Project.toml it is not restricted by any compat.

What I also noticed is that we only seem to use it in the doctests, so maybe we should just remove it completely from the main Project.toml?
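The compat arrangement described could look like this in docs/Project.toml (a sketch; only the "0.12" pin is taken from the comment):

```toml
# docs/Project.toml (sketch): pin StaticArrays for the doctests only.
# The main Project.toml carries no compat entry for it, so user
# installations of ControlSystems are not held back.
[compat]
StaticArrays = "0.12"
```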

@albheim
Member Author

albheim commented Dec 17, 2020

After discussing with @mfalt I'm now running the doctests in the same workflow as building the docs (nicer anyway, imo), so they no longer run with the tests. This allows us to not set a compat in the main Project.toml, since we have a separate Project.toml for the docs. So I have locked StaticArrays to 0.12 in the docs Project.toml, which fixes the problem with doctest, while in the main Project.toml it is not restricted by any compat.

With this said I still think we can up the supported version, but maybe leave that for #356 so we can get this PR merged?

@@ -1,9 +1,9 @@
 # Set plot globals
 ENV["PLOTS_TEST"] = "true"
-ENV["GKSwstype"] = "100"
+ENV["GKSwstype"] = "nul"
Member Author


As suggested in #401, to remove the dependency on xvfb. I did it here so I would not have to figure out how to get that working correctly in GitHub Actions just to then have to undo it. If we encounter problems with generating images for the plots we can patch it later.

@baggepinnen
Member

Checks seem to be passing, time to pull?

@albheim
Member Author

albheim commented Jan 16, 2021

Checks seem to be passing, time to pull?

I'm okay with the current state of it.

@baggepinnen
Member

Alright, I'll pull and will deal with any consequences tomorrow 😊

@baggepinnen baggepinnen merged commit ca6cc80 into master Jan 16, 2021
@baggepinnen baggepinnen deleted the github_actions branch January 16, 2021 21:22
6 participants