Switch to github actions #408

Merged (47 commits, Jan 16, 2021)
Commits
ce1fd84
Switch to github actions
albheim Dec 7, 2020
44ffd89
sudo for apt install
albheim Dec 7, 2020
3b85c37
Separate nightly and add badge
albheim Dec 8, 2020
f01144d
Remove doctest from nightly file
albheim Dec 8, 2020
70e6e1f
Move ci to same file and remove apt from test
albheim Dec 8, 2020
af814dd
Remove nightly file
albheim Dec 8, 2020
4c5fd9b
Update job names
albheim Dec 8, 2020
ccd85d1
Split test and docs in two files
albheim Dec 8, 2020
7314ee7
Add documenter key to tagbot
albheim Dec 8, 2020
6e63f29
Merge branch 'master' into github_actions
albheim Dec 8, 2020
77de81e
Combine versions, run coverage on v1
albheim Dec 9, 2020
30fe8f6
Quote v1.0 so it is not same as v1
albheim Dec 9, 2020
d4d25bb
Testing doc commands
albheim Dec 9, 2020
ee76475
move doctest to test
albheim Dec 9, 2020
880f699
add apt needed for doctest to test
albheim Dec 9, 2020
bffce77
change test jobname, try to fix weird bug
albheim Dec 9, 2020
ad622e2
move doctest to Docs
albheim Dec 9, 2020
b9ac0cf
Fix imports for docs workflow
albheim Dec 9, 2020
bd2f0d5
remove explicit doctest call, makedocs includes it
albheim Dec 9, 2020
4bd3459
Fix badges for test
albheim Dec 11, 2020
89b004e
rename workflows
albheim Dec 12, 2020
f7e3d05
doctest in runtest
albheim Dec 12, 2020
f27ae78
add apt install to ci test
albheim Dec 12, 2020
2cf7067
add timeout of 30 minutes
albheim Dec 13, 2020
1332438
Set StaticArrays compat to 1.0 and update doctest
albheim Dec 13, 2020
c1c8e60
Bind StaticArrays dep to 0.12 for julia 1.0
albheim Dec 13, 2020
8a0403d
Update test badge to point to correct workflow
albheim Dec 13, 2020
1861156
Remove StaticArrays restriction, test on julia 1.3
albheim Dec 13, 2020
5680c1b
Move back to test 1.0 with StaticArrays 0.12
albheim Dec 13, 2020
7920d07
Remove unused DOCUMENTER_KEY
albheim Dec 13, 2020
1f55060
Fix badge link
albheim Dec 13, 2020
7d4bf93
Fix badge link (again...)
albheim Dec 13, 2020
f2bb4bd
Remove StaticArrays compat, move doctest
albheim Dec 15, 2020
c117d7a
Keep compat in docs
albheim Dec 15, 2020
1b5ef98
Remove extra citation
albheim Dec 15, 2020
cc67124
remove docs apt deps from ci
albheim Dec 15, 2020
d9b21b9
add envvars for plots to doctest
albheim Dec 15, 2020
4772cc9
Remove StaticArrays from test deps
albheim Dec 15, 2020
d49c1a2
Move ENV setup for docs
albheim Dec 15, 2020
d50784b
Revert "Move ENV setup for docs"
albheim Dec 15, 2020
cdbfe20
increase timeout limit for ci and docs
albheim Dec 15, 2020
4eb8a0d
Add badge for doc status
albheim Jan 1, 2021
f521f6c
Documentation -> Docs in actions
albheim Jan 2, 2021
53e82f9
Reduce CI timeout time
albheim Jan 2, 2021
8d677c6
add newline between badges
albheim Jan 2, 2021
cb960ef
Add strict check for docs, remove double doctest
albheim Jan 12, 2021
58d0c64
Run doctest twice until we figure out strict kw
albheim Jan 13, 2021
58 changes: 58 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,58 @@
name: Docs

on:
pull_request:
branches:
- master
push:
branches:
- master
tags:
- '*'

jobs:
docs:
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
runs-on: ${{ matrix.os }}
timeout-minutes: 30
strategy:
matrix:
version: [1]
os: [ubuntu-latest]
arch: [x64]
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.version }}
arch: ${{ matrix.arch }}
- name: Install apt deps
run: sudo apt-get install libgtk-3-dev dvipng texlive
- uses: actions/cache@v1
env:
cache-name: cache-artifacts
with:
path: ~/.julia/artifacts
key: ${{ runner.os }}-test-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
restore-keys: |
${{ runner.os }}-test-${{ env.cache-name }}-
${{ runner.os }}-test-
${{ runner.os }}-
- name: Setup docs environment
run: |
julia --project=docs -e '
using Pkg
Pkg.develop(PackageSpec(path=pwd()))
Pkg.instantiate()'
- name: Test docs # Test are also run in the makedocs, but there they don't error properly unless we set strict in which case warnings also error
run: |
julia --project=docs -e '
ENV["PLOTS_TEST"] = "true"
ENV["GKSwstype"] = "nul"
using Documenter: doctest
using ControlSystems
doctest(ControlSystems)'
- name: Generate documentation
run: julia --project=docs --color=yes docs/make.jl
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
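
For reference, the separate "Test docs" step above can be reproduced locally. The following is a minimal sketch, assuming it is run from the repository root with the docs environment already prepared (the package Pkg.develop'ed into docs/Project.toml, as in the "Setup docs environment" step):

```julia
# Minimal local equivalent of the "Test docs" workflow step.
# Run with: julia --project=docs from the repository root.
ENV["PLOTS_TEST"] = "true"   # keep Plots from trying to display figures
ENV["GKSwstype"] = "nul"     # headless GR output, so no xvfb is needed

using Documenter: doctest
using ControlSystems

# Fails with an error if any doctest output in the docstrings or manual is stale.
doctest(ControlSystems)
```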
48 changes: 48 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,48 @@
name: CI

on:
pull_request:
branches:
- master
push:
branches:
- master
tags:
- '*'

jobs:
runtests:
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
runs-on: ${{ matrix.os }}
continue-on-error: ${{ matrix.version == 'nightly' }} # Allow nightly to fail and workflow still count as completed
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
version: ['1.0', '1', 'nightly']
os: [ubuntu-latest]
arch: [x64]
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.version }}
arch: ${{ matrix.arch }}
- uses: actions/cache@v1
env:
cache-name: cache-artifacts
with:
path: ~/.julia/artifacts
key: ${{ runner.os }}-test-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
restore-keys: |
${{ runner.os }}-test-${{ env.cache-name }}-
${{ runner.os }}-test-
${{ runner.os }}-
- uses: julia-actions/julia-buildpkg@v1
- uses: julia-actions/julia-runtest@v1
- uses: julia-actions/julia-processcoverage@v1
if: ${{ matrix.version == '1' }}
- uses: codecov/codecov-action@v1
if: ${{ matrix.version == '1' }}
with:
file: lcov.info
39 changes: 0 additions & 39 deletions .travis.yml

This file was deleted.

5 changes: 2 additions & 3 deletions Project.toml
@@ -33,9 +33,8 @@ julia = "1.0"

[extras]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
GR = "28b8d3ca-fb5f-59d9-8090-bfdbd6d07a71"
StaticArrays = "90137ffa-7385-5640-81b9-e52037218182"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test", "Documenter", "GR", "StaticArrays"]
test = ["Test", "Documenter", "GR"]
4 changes: 3 additions & 1 deletion README.md
@@ -1,6 +1,8 @@
# ControlSystems.jl

[![Build Status](https://travis-ci.com/JuliaControl/ControlSystems.jl.svg?branch=master)](https://travis-ci.com/JuliaControl/ControlSystems.jl)
[![Build Status](https://github.com/JuliaControl/ControlSystems.jl/workflows/CI/badge.svg)](https://github.com/JuliaControl/ControlSystems.jl/actions?query=workflow%3ACI)
[![Documentation Status](https://github.com/JuliaControl/ControlSystems.jl/workflows/Docs/badge.svg)](https://github.com/JuliaControl/ControlSystems.jl/actions?query=workflow%3ADocs)

[![PkgEval](https://juliaci.github.io/NanosoldierReports/pkgeval_badges/C/ControlSystems.svg)](https://juliaci.github.io/NanosoldierReports/pkgeval_badges/report.html)
[![codecov](https://codecov.io/gh/JuliaControl/ControlSystems.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/JuliaControl/ControlSystems.jl)

3 changes: 3 additions & 0 deletions docs/Project.toml
@@ -6,3 +6,6 @@ GR = "28b8d3ca-fb5f-59d9-8090-bfdbd6d07a71"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
StaticArrays = "90137ffa-7385-5640-81b9-e52037218182"

[compat]
StaticArrays = "0.12"
7 changes: 4 additions & 3 deletions docs/make.jl
@@ -1,9 +1,9 @@
# Set plot globals
ENV["PLOTS_TEST"] = "true"
ENV["GKSwstype"] = "100"
ENV["GKSwstype"] = "nul"
albheim (Member, Author) commented:

As suggested in #401, to remove the dependency on xvfb. I did it here so I would not have to figure out how to get that working correctly in GitHub Actions just to then have to undo it. If we encounter problems with generating images for the plots, we can patch it later.


using Documenter, ControlSystems, Plots, LinearAlgebra, DSP
import GR # Bug with world age in Plots.jl, see https://github.com/JuliaPlots/Plots.jl/issues/1047
#import GR # Bug with world age in Plots.jl, see https://github.com/JuliaPlots/Plots.jl/issues/1047
gr()
default(show=false, size=(800,450))

@@ -30,6 +30,7 @@ println("Making docs")
makedocs(modules=[ControlSystems],
format=Documenter.HTML(),
sitename="ControlSystems.jl",
#strict=true,
pages=[
"Home" => "index.md",
"Guide" => Any[
@@ -47,6 +48,6 @@ makedocs(modules=[ControlSystems],
"Plotting" => "lib/plotting.md",
],
]
)
)

deploydocs(repo = "github.com/JuliaControl/ControlSystems.jl.git")
7 changes: 0 additions & 7 deletions test/runtests.jl
@@ -1,6 +1,5 @@
using ControlSystems
using Test, LinearAlgebra, Random
using Documenter # For doctests
import Base.isapprox # In framework and test_synthesis
import SparseArrays: sparse # In test_matrix_comps
import DSP: conv # In test_conversion and test_synthesis
@@ -39,10 +38,4 @@ my_tests = [
_t0 = time()
run_tests(my_tests)
println("Ran all code tests in $(round(time()-_t0, digits=2)) seconds")


println("Test Doctests")
_t0 = time()
doctest(ControlSystems)
println("Ran Doctests in $(round(time()-_t0, digits=2)) seconds")
Comment on lines -44 to -47
albheim (Member, Author) commented:

It seems that not having the doctests run together with the other tests, so that they are included when checking coverage, reduces coverage quite a bit.

Contributor commented:

Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for test coverage, since I believe doctests are designed more to ensure that the documentation examples don't go stale than to verify that the code is functionally correct.

@mfalt (Member) commented on Dec 8, 2020:

I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions allows us to visualize and to stop a merge when a doctest fails, I have no problem removing them from the tests, provided we also add tests to cover the same functionality.

Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

albheim (Member, Author) replied:

> Can you pinpoint which parts are getting reduced coverage when the doctests are removed? I don't think we should be relying on the doctests for test coverage, since I believe doctests are designed more to ensure that the documentation examples don't go stale than to verify that the code is functionally correct.

In the coverage report you can see exactly which lines lost coverage. Most of it seems to come from pid_design.jl and discrete.jl.

albheim (Member, Author) replied:

> I agree, but historically we have had problems where the docs didn't build properly or were outdated without us noticing; that is why I added it to the tests as well. If GitHub Actions allows us to visualize and to stop a merge when a doctest fails, I have no problem removing them from the tests, provided we also add tests to cover the same functionality.

We can have a step in the workflow that runs doctest and runs make.jl for the docs, and I believe we can make it required.

> Edit: A nice feature of having the doctests in the tests is that we won't forget to run them when developing locally. I don't know about you, but I rarely run them manually when developing.

I'd never run doctest locally if it was not included, though I'm not sure I'd want to most of the time. It feels like the tests are what I'd use to see whether I'm close to something working, and the CI run on the PR can catch it if something broke with the docs, so a little less time running locally for me 😄

But if you feel it's warranted to run doctest all the time, that is also fine by me; you have more experience with the workflows for this package.

Member commented:

The integration with GitHub Actions is (unsurprisingly) much better than it ever was for Travis, and a commit or PR will display a red cross instead of a green checkmark whenever any of the workflows failed. It should thus be fairly easy to detect the docs build failing even if it is not part of the tests. There is some noise left due to coverage sometimes reporting a failure for no good reason, but a docs failure remains easily visible on hovering over the red cross.

albheim (Member, Author) replied:

We had some weird behaviour where makedocs in docs/make.jl failed the doctests (it runs doctest internally, and it printed the errors we had from the version problems) but still managed to do the build, and the workflow counted as passed. This was not what I expected, and it is not documented as far as I can see in Documenter.jl, so for now I run doctest by itself once and then run docs/make.jl, where makedocs runs the doctests a second time (this could be disabled for makedocs).

end
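
As a follow-up to the comment above about the duplicated doctest run, a possible future shape of docs/make.jl is sketched below. This is only a hypothetical sketch, not part of this PR: it assumes the remaining documentation warnings get cleaned up so that strict=true can be enabled, at which point the standalone doctest() call in the Docs workflow could be dropped (or, alternatively, makedocs could be passed doctest=false to skip the second run while keeping the standalone call as the gate):

```julia
# Hypothetical docs/make.jl configuration (an assumption, not part of this PR).
using Documenter, ControlSystems

makedocs(modules  = [ControlSystems],
         format   = Documenter.HTML(),
         sitename = "ControlSystems.jl",
         strict   = true,   # doctest failures (and warnings) now error the build
         # doctest = false, # alternative: skip the duplicate run inside makedocs
         pages    = ["Home" => "index.md"])  # abbreviated; the real page list stays as in this PR

deploydocs(repo = "github.com/JuliaControl/ControlSystems.jl.git")
```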
2 changes: 1 addition & 1 deletion test/test_plots.jl
@@ -1,6 +1,6 @@
# Set plot globals
ENV["PLOTS_TEST"] = "true"
ENV["GKSwstype"] = "100"
ENV["GKSwstype"] = "nul"

using Plots
gr()