
Switch to github actions #408

Merged: 47 commits, Jan 16, 2021
Changes from 7 commits
ce1fd84
Switch to github actions
albheim Dec 7, 2020
44ffd89
sudo for apt install
albheim Dec 7, 2020
3b85c37
Separate nightly and add badge
albheim Dec 8, 2020
f01144d
Remove doctest from nightly file
albheim Dec 8, 2020
70e6e1f
Move ci to same file and remove apt from test
albheim Dec 8, 2020
af814dd
Remove nightly file
albheim Dec 8, 2020
4c5fd9b
Update job names
albheim Dec 8, 2020
ccd85d1
Split test and docs in two files
albheim Dec 8, 2020
7314ee7
Add documenter key to tagbot
albheim Dec 8, 2020
6e63f29
Merge branch 'master' into github_actions
albheim Dec 8, 2020
77de81e
Combine versions, run coverage on v1
albheim Dec 9, 2020
30fe8f6
Quote v1.0 so it is not same as v1
albheim Dec 9, 2020
d4d25bb
Testing doc commands
albheim Dec 9, 2020
ee76475
move doctest to test
albheim Dec 9, 2020
880f699
add apt needed for doctest to test
albheim Dec 9, 2020
bffce77
change test jobname, try to fix weird bug
albheim Dec 9, 2020
ad622e2
move doctest to Docs
albheim Dec 9, 2020
b9ac0cf
Fix imports for docs workflow
albheim Dec 9, 2020
bd2f0d5
remove explicit doctest call, makedocs includes it
albheim Dec 9, 2020
4bd3459
Fix badges for test
albheim Dec 11, 2020
89b004e
rename workflows
albheim Dec 12, 2020
f7e3d05
doctest in runtest
albheim Dec 12, 2020
f27ae78
add apt install to ci test
albheim Dec 12, 2020
2cf7067
add timeout of 30 minutes
albheim Dec 13, 2020
1332438
Set StaticArrays compat to 1.0 and update doctest
albheim Dec 13, 2020
c1c8e60
Bind StaticArrays dep to 0.12 for julia 1.0
albheim Dec 13, 2020
8a0403d
Update test badge to point to correct workflow
albheim Dec 13, 2020
1861156
Remove StaticArrays restriction, test on julia 1.3
albheim Dec 13, 2020
5680c1b
Move back to test 1.0 with StaticArrays 0.12
albheim Dec 13, 2020
7920d07
Remove unused DOCUMENTER_KEY
albheim Dec 13, 2020
1f55060
Fix badge link
albheim Dec 13, 2020
7d4bf93
Fix badge link (again...)
albheim Dec 13, 2020
f2bb4bd
Remove StaticArrays compat, move doctest
albheim Dec 15, 2020
c117d7a
Keep compat in docs
albheim Dec 15, 2020
1b5ef98
Remove extra citation
albheim Dec 15, 2020
cc67124
remove docs apt deps from ci
albheim Dec 15, 2020
d9b21b9
add envvars for plots to doctest
albheim Dec 15, 2020
4772cc9
Remove StaticArrays from test deps
albheim Dec 15, 2020
d49c1a2
Move ENV setup for docs
albheim Dec 15, 2020
d50784b
Revert "Move ENV setup for docs"
albheim Dec 15, 2020
cdbfe20
increase timeout limit for ci and docs
albheim Dec 15, 2020
4eb8a0d
Add badge for doc status
albheim Jan 1, 2021
f521f6c
Documentation -> Docs in actions
albheim Jan 2, 2021
53e82f9
Reduce CI timeout time
albheim Jan 2, 2021
8d677c6
add newline between badges
albheim Jan 2, 2021
cb960ef
Add strict check for docs, remove double doctest
albheim Jan 12, 2021
58d0c64
Run doctest twice until we figure out strict kw
albheim Jan 13, 2021
101 changes: 101 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,101 @@
name: CI
on:
pull_request:
branches:
- master
push:
branches:
- master
tags:
- '*'
jobs:
tests-required: # Tests that needs to succeed
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} -
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
version:
- '1.0' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
- '1' # Leave this line unchanged. '1' will automatically expand to the latest stable 1.x release of Julia.
os:
- ubuntu-latest
arch:
- x64
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.version }}
arch: ${{ matrix.arch }}
- uses: actions/cache@v1
env:
cache-name: cache-artifacts
with:
path: ~/.julia/artifacts
key: ${{ runner.os }}-test-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
restore-keys: |
${{ runner.os }}-test-${{ env.cache-name }}-
${{ runner.os }}-test-
${{ runner.os }}-
- uses: julia-actions/julia-buildpkg@v1
- uses: julia-actions/julia-runtest@v1
- uses: julia-actions/julia-processcoverage@v1
- uses: codecov/codecov-action@v1
with:
file: lcov.info
Member Author (@albheim), Dec 8, 2020:

Do we want to run coverage for both 1.0 and 1? What does that even mean? Does it average over the two? Does it pick the one that reported first/last?

Contributor:

IMO coverage is probably only needed on one version, unless a lot of version-specific code starts to be introduced (which I don't foresee being an issue in this package).
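One way to restrict coverage to a single matrix entry would be to gate the coverage steps on the matrix version. This is only a sketch of the idea, not something this PR implements; the `if:` placement is the assumption here:

```yaml
# Hypothetical: run tests on every matrix entry, but only upload
# coverage from the latest-stable ('1') entry.
- uses: julia-actions/julia-runtest@v1
- uses: julia-actions/julia-processcoverage@v1
  if: matrix.version == '1'
- uses: codecov/codecov-action@v1
  if: matrix.version == '1'
  with:
    file: lcov.info
```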

test-allowed-fail: # Tests that are allowed to fail
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - [allowed to fail] -
runs-on: ${{ matrix.os }}
continue-on-error: true
strategy:
fail-fast: false
matrix:
version:
- 'nightly'
os:
- ubuntu-latest
arch:
- x64
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.version }}
arch: ${{ matrix.arch }}
- uses: actions/cache@v1
env:
cache-name: cache-artifacts
with:
path: ~/.julia/artifacts
key: ${{ runner.os }}-test-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
restore-keys: |
${{ runner.os }}-test-${{ env.cache-name }}-
${{ runner.os }}-test-
${{ runner.os }}-
- uses: julia-actions/julia-buildpkg@v1
- uses: julia-actions/julia-runtest@v1
docs:
name: Documentation
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: '1'
- name: Install apt deps
run: sudo apt-get install libgtk-3-dev dvipng texlive
- run: |
julia --project=docs -e '
using Pkg
Pkg.develop(PackageSpec(path=pwd()))
Pkg.instantiate()'
- run: |
julia --project=docs -e '
using Documenter: doctest
using ControlSystems
doctest(ControlSystems)'
- run: julia --project=docs docs/make.jl
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }}
39 changes: 0 additions & 39 deletions .travis.yml
This file was deleted.

2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# ControlSystems.jl

[![Build Status](https://travis-ci.com/JuliaControl/ControlSystems.jl.svg?branch=master)](https://travis-ci.com/JuliaControl/ControlSystems.jl)
[![Build Status](https://github.com/JuliaControl/ControlSystems.jl/workflows/CI/badge.svg)](https://github.com/JuliaControl/ControlSystems.jl/actions)
[![PkgEval](https://juliaci.github.io/NanosoldierReports/pkgeval_badges/C/ControlSystems.svg)](https://juliaci.github.io/NanosoldierReports/pkgeval_badges/report.html)
[![codecov](https://codecov.io/gh/JuliaControl/ControlSystems.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/JuliaControl/ControlSystems.jl)

2 changes: 1 addition & 1 deletion docs/make.jl
@@ -1,6 +1,6 @@
# Set plot globals
ENV["PLOTS_TEST"] = "true"
ENV["GKSwstype"] = "100"
ENV["GKSwstype"] = "nul"
Member Author (@albheim):

As suggested in #401, to remove the dependency on xvfb. I did it here so I would not have to figure out how to get xvfb working correctly in GitHub Actions just to then undo it. If we encounter problems with generating images for the plots, we can patch it later.
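For context, the headless-plotting setup this change enables amounts to the following (a sketch mirroring `test_plots.jl` in this diff; GR's "nul" workstation type draws to a null device, so no X server is needed):

```julia
# Select GR's null output device before Plots initializes, so that
# plots can be rendered on CI without an X server (no xvfb required).
ENV["GKSwstype"] = "nul"

using Plots
gr()  # use the GR backend, which reads GKSwstype at init time
```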


using Documenter, ControlSystems, Plots, LinearAlgebra, DSP
import GR # Bug with world age in Plots.jl, see https://github.com/JuliaPlots/Plots.jl/issues/1047
7 changes: 0 additions & 7 deletions test/runtests.jl
@@ -1,6 +1,5 @@
using ControlSystems
using Test, LinearAlgebra, Random
using Documenter # For doctests
import Base.isapprox # In framework and test_synthesis
import SparseArrays: sparse # In test_matrix_comps
import DSP: conv # In test_conversion and test_synthesis
@@ -39,10 +38,4 @@ my_tests = [
_t0 = time()
run_tests(my_tests)
println("Ran all code tests in $(round(time()-_t0, digits=2)) seconds")


println("Test Doctests")
_t0 = time()
doctest(ControlSystems)
println("Ran Doctests in $(round(time()-_t0, digits=2)) seconds")
Comment on lines -44 to -47

Member Author (@albheim):

It seems that not running the doctests alongside the other tests, where coverage is measured, reduces coverage quite a bit.

Contributor:

Can you pinpoint which parts lose coverage when the doctests are removed? I don't think we should rely on the doctests for test coverage, since doctests are designed to ensure the documentation examples don't go stale rather than that the code is functionally correct.

Member (@mfalt), Dec 8, 2020:

I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added the doctests to the tests as well. If GitHub Actions lets us surface a doctest failure and block a merge on it, I have no problem removing them from the tests, provided we also add tests covering the same functionality.

Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

Member Author (@albheim):

> Can you pinpoint which parts lose coverage when the doctests are removed? I don't think we should rely on the doctests for test coverage, since doctests are designed to ensure the documentation examples don't go stale rather than that the code is functionally correct.

In the coverage report you can see exactly which lines lost coverage. Most of it seems to come from pid_design.jl and discrete.jl.

Member Author (@albheim):

> I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added the doctests to the tests as well. If GitHub Actions lets us surface a doctest failure and block a merge on it, I have no problem removing them from the tests, provided we also add tests covering the same functionality.

We can have a step in the workflow that runs doctest and another that runs docs/make.jl, and I believe we can make it required.

> Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

I'd never run doctest locally if it were not in the tests, though I'm not sure I'd want to most of the time. The tests are what I use to check whether I'm close to something working, and CI on the PR can catch whether something broke in the docs; a little less time running locally for me 😄

But if you feel it's warranted to run doctest every time, that is also fine by me; you have more experience with the workflows for this package.
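For reference, running the doctests locally uses the same invocation as the docs workflow above (run from the package root, with the package dev'd into the `docs/` environment, e.g. via `julia --project=docs`):

```julia
# Same call the CI docs job makes before building the docs.
using Documenter: doctest
using ControlSystems

doctest(ControlSystems)
```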

Member:

The integration of GitHub Actions is (unsurprisingly) much better than it ever was for Travis, and a commit or PR will display the red cross instead of the green check whenever any of the workflows failed. It should thus be fairly easy to detect the docs build failing even if it's not part of the tests. There is some noise left due to coverage sometimes reporting a failure for no good reason, but a docs failure remains easily visible on hovering over the red cross.

Member Author (@albheim):

We had some weird behaviour where makedocs in docs/make.jl failed the doctests (it runs doctest internally, and it printed the errors we had from the version problems) but still managed to do the build, and the workflow counted as passed. This was unexpected and, as far as I can see, not documented in Documenter.jl, so for now I run doctest by itself once and then run docs/make.jl, where makedocs runs the doctests a second time (this could be disabled in the makedocs call).
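A sketch of the relevant makedocs keywords (assuming Documenter.jl's API at the time; the `sitename` and `modules` values here are illustrative): `strict = true` turns doctest and other errors into a build failure, and `doctest = false` skips the duplicate doctest run inside makedocs when doctest() has already been run separately.

```julia
using Documenter, ControlSystems

makedocs(
    modules = [ControlSystems],
    sitename = "ControlSystems.jl",
    strict = true,    # fail the build on doctest (and other) errors
    doctest = false,  # skip here; doctest() already ran in its own step
)
```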

end
2 changes: 1 addition & 1 deletion test/test_plots.jl
@@ -1,6 +1,6 @@
# Set plot globals
ENV["PLOTS_TEST"] = "true"
ENV["GKSwstype"] = "100"
ENV["GKSwstype"] = "nul"

using Plots
gr()