Merge pull request #5 from AdamRJensen/main
AdamRJensen authored Oct 19, 2024
2 parents 483db92 + b533057 commit 7183b0d
Showing 299 changed files with 121,869 additions and 89,380 deletions.
4 changes: 2 additions & 2 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -3,8 +3,8 @@
 - [ ] Closes #xxxx
 - [ ] I am familiar with the [contributing guidelines](https://pvlib-python.readthedocs.io/en/latest/contributing.html)
 - [ ] Tests added
-- [ ] Updates entries in [`docs/sphinx/source/reference`](https://github.com/pvlib/pvlib-python/blob/master/docs/sphinx/source/reference) for API changes.
-- [ ] Adds description and name entries in the appropriate "what's new" file in [`docs/sphinx/source/whatsnew`](https://github.com/pvlib/pvlib-python/tree/master/docs/sphinx/source/whatsnew) for all changes. Includes link to the GitHub Issue with `` :issue:`num` `` or this Pull Request with `` :pull:`num` ``. Includes contributor name and/or GitHub username (link with `` :ghuser:`user` ``).
+- [ ] Updates entries in [`docs/sphinx/source/reference`](https://github.com/pvlib/pvlib-python/blob/main/docs/sphinx/source/reference) for API changes.
+- [ ] Adds description and name entries in the appropriate "what's new" file in [`docs/sphinx/source/whatsnew`](https://github.com/pvlib/pvlib-python/tree/main/docs/sphinx/source/whatsnew) for all changes. Includes link to the GitHub Issue with `` :issue:`num` `` or this Pull Request with `` :pull:`num` ``. Includes contributor name and/or GitHub username (link with `` :ghuser:`user` ``).
 - [ ] New code is fully documented. Includes [numpydoc](https://numpydoc.readthedocs.io/en/latest/format.html) compliant docstrings, examples, and comments where necessary.
 - [ ] Pull request is nearly complete and ready for detailed review.
 - [ ] Maintainer: Appropriate GitHub Labels (including `remote-data`) and Milestone are assigned to the Pull Request and linked Issue.
33 changes: 23 additions & 10 deletions .github/workflows/asv_check.yml
@@ -1,34 +1,47 @@
 name: asv

 # CI ASV CHECK is aimed to verify that the benchmarks execute without error.
-on: [pull_request, push]
+on:
+  push:
+    branches:
+    - main
+  pull_request:


 jobs:
-  quick:
+  quick-benchmarks:
     runs-on: ubuntu-latest
     defaults:
       run:
         shell: bash -el {0}

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           fetch-depth: 0

       - name: Install Python
-        uses: actions/setup-python@v3
+        uses: actions/setup-python@v5
         with:
-          python-version: '3.9.7'
+          python-version: '3.9'

       - name: Install asv
         run: pip install asv==0.4.2

+      # asv 0.4.2 (and more recent versions as well) creates conda envs
+      # using the --force option, which was removed in conda 24.3.
+      # Since ubuntu-latest now comes with conda 24.3 pre-installed,
+      # using the system's conda will result in error.
+      # To prevent that, we install an older version.
+      # TODO: remove this when we eventually upgrade our asv version.
+      # https://github.com/airspeed-velocity/asv/issues/1396
+      - name: Install Conda
+        uses: conda-incubator/setup-miniconda@v3
+        with:
+          conda-version: 24.1.2
+
       - name: Run asv benchmarks
         run: |
           cd benchmarks
           asv machine --yes
-          asv run HEAD^! --quick --dry-run --show-stderr | sed "/failed$/ s/^/##[error]/" | tee benchmarks.log
-          if grep "failed" benchmarks.log > /dev/null ; then
-              exit 1
-          fi
+          asv run HEAD^! --quick --dry-run --show-stderr
17 changes: 17 additions & 0 deletions .github/workflows/flake8-linter-matcher.json
@@ -0,0 +1,17 @@
+{
+  "problemMatcher": [
+    {
+      "owner": "flake8-linter-error",
+      "severity": "error",
+      "pattern": [
+        {
+          "regexp": "^([^:]+):(\\d+):(\\d+):\\s+([EWCNF]\\d+\\s+.+)$",
+          "file": 1,
+          "line": 2,
+          "column": 3,
+          "message": 4
+        }
+      ]
+    }
+  ]
+}
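The matcher's behavior is easy to sanity-check outside of CI: the regexp above should capture the file, line, column, and message groups from a default-format flake8 diagnostic. A minimal sketch (the sample path and error line are made up for illustration):

```python
import re

# Same pattern as in flake8-linter-matcher.json (with JSON escaping removed)
pattern = re.compile(r"^([^:]+):(\d+):(\d+):\s+([EWCNF]\d+\s+.+)$")

# A made-up flake8 diagnostic in flake8's default output format
sample = "pvlib/example.py:12:80: E501 line too long (88 > 79 characters)"

m = pattern.match(sample)
print(m.group(1))  # file:    pvlib/example.py
print(m.group(2))  # line:    12
print(m.group(3))  # column:  80
print(m.group(4))  # message: E501 line too long (88 > 79 characters)
```

The four capture groups correspond to the `file`, `line`, `column`, and `message` indices in the matcher, which is what lets GitHub render each flake8 finding as an inline PR annotation.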
27 changes: 27 additions & 0 deletions .github/workflows/flake8.yml
@@ -0,0 +1,27 @@
+name: Python Flake8 Linter
+on:
+  pull_request:
+jobs:
+  flake8-linter:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout source
+        uses: actions/checkout@v4
+      - name: Install Python 3.11
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.11'
+      - name: Install Flake8 5.0.4 linter
+        run: pip install flake8==5.0.4  # use this version for --diff option
+      - name: Setup Flake8 output matcher for PR annotations
+        run: echo '::add-matcher::.github/workflows/flake8-linter-matcher.json'
+      - name: Fetch pull request target branch
+        run: |
+          git remote add upstream https://github.com/pvlib/pvlib-python.git
+          git fetch upstream $GITHUB_BASE_REF
+      - name: Run Flake8 linter
+        run: git diff upstream/$GITHUB_BASE_REF HEAD -- "*.py" | flake8
+             --exclude pvlib/version.py
+             --ignore E201,E241,E226,W503,W504
+             --max-line-length 79
+             --diff
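The last step lints only the lines the pull request actually changed, by piping the diff into `flake8 --diff`. The same check can be reproduced approximately on a local clone before pushing; a sketch, assuming a remote named `upstream` and a base branch of `main`:

```shell
# Lint only the Python lines changed relative to the base branch,
# mirroring the CI step above (remote and branch names are assumptions).
pip install flake8==5.0.4
git fetch upstream main
git diff upstream/main HEAD -- "*.py" | flake8 \
    --exclude pvlib/version.py \
    --ignore E201,E241,E226,W503,W504 \
    --max-line-length 79 \
    --diff
```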
14 changes: 9 additions & 5 deletions .github/workflows/publish.yml
@@ -4,7 +4,7 @@ on:
   pull_request:
   push:
     branches:
-      - master
+      - main
     tags:
       - "v*"

@@ -15,27 +15,31 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       # fetch all commits and tags so versioneer works
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
         with:
           fetch-depth: 0

       - name: Set up Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v5
         with:
-          python-version: 3.8
+          python-version: 3.9

       - name: Install build tools
         run: |
           python -m pip install --upgrade pip
           python -m pip install build
+          python -m pip install twine

       - name: Build packages
         run: python -m build

+      - name: Check metadata verification
+        run: python -m twine check --strict dist/*
+
       # only publish distribution to PyPI for tagged commits
       - name: Publish distribution to PyPI
         if: startsWith(github.ref, 'refs/tags/v')
         uses: pypa/gh-action-pypi-publish@release/v1
         with:
           user: __token__
-          password: ${{ secrets.pypi_password }}
+          password: ${{ secrets.pypi_password }}
24 changes: 13 additions & 11 deletions .github/workflows/pytest-remote-data.yml
@@ -1,5 +1,5 @@
 # A secondary test job that only runs the iotools tests if explicitly requested
-# (for pull requests) or on a push to the master branch.
+# (for pull requests) or on a push to the main branch.
 # Because the iotools tests require GitHub secrets, we need to be careful about
 # malicious PRs accessing the secrets and exposing them externally.
 #
@@ -47,36 +47,36 @@ on:
   pull_request_target:
   push:
     branches:
-      - master
+      - main

 jobs:
   test:

     strategy:
       fail-fast: false  # don't cancel other matrix jobs when one fails
       matrix:
-        python-version: [3.7, 3.8, 3.9, "3.10"]
+        python-version: [3.9, "3.10", "3.11", "3.12"]
         suffix: ['']  # the alternative to "-min"
         include:
-          - python-version: 3.7
+          - python-version: 3.9
             suffix: -min

     runs-on: ubuntu-latest
     if: (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'remote-data')) || (github.event_name == 'push')

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         if: github.event_name == 'pull_request_target'
-        # pull_request_target runs in the context of the target branch (pvlib/master),
+        # pull_request_target runs in the context of the target branch (pvlib/main),
         # but what we need is the hypothetical merge commit from the PR:
         with:
           ref: "refs/pull/${{ github.event.number }}/merge"

-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
         if: github.event_name == 'push'

       - name: Set up conda environment
-        uses: conda-incubator/setup-miniconda@v2
+        uses: conda-incubator/setup-miniconda@v3
         with:
           activate-environment: test_env
           environment-file: ${{ env.REQUIREMENTS }}

@@ -99,12 +99,14 @@ jobs:
           SOLARANYWHERE_API_KEY: ${{ secrets.SOLARANYWHERE_API_KEY }}
           BSRN_FTP_USERNAME: ${{ secrets.BSRN_FTP_USERNAME }}
           BSRN_FTP_PASSWORD: ${{ secrets.BSRN_FTP_PASSWORD }}
-        run: pytest pvlib/tests/iotools pvlib/tests/test_forecast.py --cov=./ --cov-report=xml --remote-data
+        run: pytest pvlib/tests/iotools --cov=./ --cov-report=xml --remote-data

       - name: Upload coverage to Codecov
-        if: matrix.python-version == 3.7 && matrix.suffix == ''
-        uses: codecov/codecov-action@v2
+        if: matrix.python-version == 3.9 && matrix.suffix == ''
+        uses: codecov/codecov-action@v4
         with:
           fail_ci_if_error: true
           verbose: true
+          flags: remote-data  # flags are configured in codecov.yml
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
26 changes: 15 additions & 11 deletions .github/workflows/pytest.yml
@@ -4,20 +4,20 @@ on:
   pull_request:
   push:
     branches:
-      - master
+      - main

 jobs:
   test:
     strategy:
       fail-fast: false  # don't cancel other matrix jobs when one fails
       matrix:
         os: [ubuntu-latest, macos-latest, windows-latest]
-        python-version: [3.7, 3.8, 3.9, "3.10"]
+        python-version: [3.9, "3.10", "3.11", "3.12"]
         environment-type: [conda, bare]
         suffix: ['']  # placeholder as an alternative to "-min"
         include:
           - os: ubuntu-latest
-            python-version: 3.7
+            python-version: 3.9
             environment-type: conda
             suffix: -min
         exclude:

@@ -31,7 +31,7 @@ jobs:
     steps:
       # We check out only a limited depth and then pull tags to save time
       - name: Checkout source
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
         with:
           fetch-depth: 100

@@ -40,12 +40,14 @@
       - name: Install Conda environment with Micromamba
         if: matrix.environment-type == 'conda'
-        uses: mamba-org/provision-with-micromamba@v12
+        uses: mamba-org/setup-micromamba@v1
         with:
           environment-file: ${{ env.REQUIREMENTS }}
           cache-downloads: true
-          extra-specs: |
+          create-args: >-
             python=${{ matrix.python-version }}
+          condarc: |
+            channel-priority: flexible
         env:
           # build requirement filename. First replacement is for the python
           # version, second is to add "-min" if needed

@@ -58,7 +60,7 @@
       - name: Install bare Python ${{ matrix.python-version }}${{ matrix.suffix }}
         if: matrix.environment-type == 'bare'
-        uses: actions/setup-python@v1
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}

@@ -76,13 +78,15 @@
       - name: Run tests
         shell: bash -l {0}  # necessary for conda env to be active
         run: |
-          # ignore iotools & forecast; those tests are run in a separate workflow
-          pytest pvlib --cov=./ --cov-report=xml --ignore=pvlib/tests/iotools --ignore=pvlib/tests/test_forecast.py
+          # ignore iotools; those tests are run in a separate workflow
+          pytest pvlib --cov=./ --cov-report=xml --ignore=pvlib/tests/iotools

       - name: Upload coverage to Codecov
-        if: matrix.python-version == 3.7 && matrix.suffix == '' && matrix.os == 'ubuntu-latest' && matrix.environment-type == 'conda'
-        uses: codecov/codecov-action@v2
+        if: matrix.python-version == 3.9 && matrix.suffix == '' && matrix.os == 'ubuntu-latest' && matrix.environment-type == 'conda'
+        uses: codecov/codecov-action@v4
         with:
           fail_ci_if_error: true
           verbose: true
+          flags: core  # flags are configured in codecov.yml
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
32 changes: 32 additions & 0 deletions .github/workflows/top_ranked_issue.yml
@@ -0,0 +1,32 @@
+name: top_ranked_issues
+
+on:
+  pull_request:  # for testings purposes, should be removed
+  schedule:
+    # Runs every day at 3:00 AM UTC
+    - cron: '0 3 * * *'
+
+jobs:
+  run-script:
+    runs-on: ubuntu-latest
+
+    # Define GitHub access token
+    env:
+      GITHUB_ACCESS_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+
+    steps:
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.12"
+
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install github
+      # Run a sample script
+      - name: Run Script
+        run: |
+          python ./scripts/update_top_ranking_issues.py
2 changes: 2 additions & 0 deletions .gitignore
@@ -40,8 +40,10 @@ pvlib/spa_c_files/spa_tester.c

 # generated documentation
 docs/sphinx/source/reference/generated
+docs/sphinx/source/reference/*/generated
 docs/sphinx/source/savefig
 docs/sphinx/source/gallery
+docs/sphinx/source/sg_execution_times.rst

 # Installer logs
 pip-log.txt
3 changes: 0 additions & 3 deletions .lgtm.yml

This file was deleted.

8 changes: 0 additions & 8 deletions .stickler.yml

This file was deleted.
