
updated fork #1


Open · wants to merge 38 commits into base: CI

Commits (38)
a0812b1
CI asv check (#1454)
roger-lcc May 4, 2022
5047b26
Updated get_cams protocol to https #1457 (#1458)
PrajwalBorkar May 17, 2022
91986b9
remove ineffective assignment of `data_slope_nstd` (#1461)
hf-kklein Jun 13, 2022
c2fd03a
Added Benchmarking functions for scaling.py module (#1445)
Naman-Priyadarshi Jun 13, 2022
f928d39
Added Benchmarking functions for scaling.py module (#1445)
Naman-Priyadarshi Jun 13, 2022
04e3ffd
Migrate from Azure to GH Actions (#1306)
kandersolar Jun 15, 2022
577f3b5
Accept albedo in weather input to ModelChain.run_model method (#1469)
cwhanse Jun 21, 2022
35af84e
Revert "Accept albedo in weather input to ModelChain.run_model method…
cwhanse Jun 21, 2022
c8b04a5
Update README badges (#1482)
kandersolar Jun 22, 2022
c0daa5d
Move surface orientation calculation from tracking.singleaxis to new …
kandersolar Jul 5, 2022
b265b1c
Gallery example for modeling fixed-tilt arrays with pvfactors (#1470)
kandersolar Jul 5, 2022
f2d14ce
add reference to inverter.sandia_multi (#1479)
cwhanse Jul 5, 2022
c3016f1
Allow read_surfrad to parse http files (#1459)
AdamRJensen Jul 19, 2022
59cbae7
Update GitHub Action workflows to use micromamba (#1493)
jules-ch Jul 19, 2022
3f397ed
pvlib.tracking docstrings updated to indicate limits of axis_tilt (#1…
kurt-rhee Jul 19, 2022
c9bbdbb
Improve nonuniform timestamps error message for prilliman and detect_…
kandersolar Aug 3, 2022
65e6fb3
Drop py3.6, add 3.10; switch CI from macos-10.15 to macos-latest (#1507)
kandersolar Aug 12, 2022
e659a5a
Make leap_day=True default for PSM3 (deprecation) (#1511)
AdamRJensen Aug 15, 2022
7348be8
Update PVGIS docs and test to comply with PVGIS version 5.2 (#1502)
AdamRJensen Aug 15, 2022
4d75e25
Allow read_tmy3 to handle 00:00 timestamps (#1494)
AdamRJensen Aug 16, 2022
b2eca71
Use pep517/518 build system for modern build isolation (#1495)
jules-ch Aug 16, 2022
9144a3b
Add solaranywhere api key to pytest-remote-data.yml (#1531)
AdamRJensen Aug 17, 2022
798799a
Fix PyPI deploy workflow (#1532)
kandersolar Aug 17, 2022
3692427
setup.py: use setuptools.find_namespace_packages() (#1483)
kandersolar Aug 17, 2022
2fcd6ba
Update cams links (#1529)
AdamRJensen Aug 18, 2022
e1393f7
Accept albedo in weather input to ModelChain.run_model method (#1478)
cwhanse Aug 19, 2022
6ffe257
Finalize 0.9.2 (#1534)
kandersolar Aug 19, 2022
6a94e35
Removing the 'closed' kwarg from pd.date_range in gallery examples (…
joaoguilhermeS Aug 29, 2022
ac2cb4b
Lookup altitude (#1518)
nicomt Aug 31, 2022
875aa10
Townsend snow (building on #1251) (#1468)
reepoi Sep 12, 2022
543d97a
Spectral mismatch calculation code (#1524)
adriesse Sep 13, 2022
38fc142
Temperature model parameter translation code (#1463)
adriesse Sep 13, 2022
ce7ddbc
Clarify docstring descriptions of cross-axis slope (#1530)
chiragpachori Sep 14, 2022
73965c2
Finalize 0.9.3 (#1552)
kandersolar Sep 15, 2022
bdbaf4c
Fix code style issues flagged by LGTM (#1559)
chrisorner Sep 27, 2022
b7768b4
Fix assertion in test_irradiance.py::test_get_ground_diffuse_albedo_0…
bowie2211 Oct 21, 2022
dd6062a
Add optional `return_components` parameter to irradiance.haydavies (#…
spaneja Oct 31, 2022
e50def0
Implement irradiance.complete_irradiance with component sum equations…
kperrynrel Nov 1, 2022
2 changes: 1 addition & 1 deletion .coveragerc
Original file line number Diff line number Diff line change
@@ -1,2 +1,2 @@
[run]
omit = pvlib/_version.py
omit = pvlib/version.py
2 changes: 0 additions & 2 deletions .gitattributes
@@ -1,4 +1,2 @@
pvlib/version.py export-subst

# reduce the number of merge conflicts
docs/sphinx/source/whatsnew/* merge=union
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -7,6 +7,6 @@
- [ ] Adds description and name entries in the appropriate "what's new" file in [`docs/sphinx/source/whatsnew`](https://github.com/pvlib/pvlib-python/tree/master/docs/sphinx/source/whatsnew) for all changes. Includes link to the GitHub Issue with `` :issue:`num` `` or this Pull Request with `` :pull:`num` ``. Includes contributor name and/or GitHub username (link with `` :ghuser:`user` ``).
- [ ] New code is fully documented. Includes [numpydoc](https://numpydoc.readthedocs.io/en/latest/format.html) compliant docstrings, examples, and comments where necessary.
- [ ] Pull request is nearly complete and ready for detailed review.
- [ ] Maintainer: Appropriate GitHub Labels and Milestone are assigned to the Pull Request and linked Issue.
- [ ] Maintainer: Appropriate GitHub Labels (including `remote-data`) and Milestone are assigned to the Pull Request and linked Issue.

<!-- Brief description of the problem and proposed solution (if not already fully described in the issue linked to above): -->
34 changes: 34 additions & 0 deletions .github/workflows/asv_check.yml
@@ -0,0 +1,34 @@
name: asv

# CI ASV CHECK is aimed to verify that the benchmarks execute without error.
on: [pull_request, push]

jobs:
quick:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -el {0}

steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0

- name: Install Python
uses: actions/setup-python@v3
with:
python-version: '3.9.7'

- name: Install asv
run: pip install asv==0.4.2

- name: Run asv benchmarks
run: |
cd benchmarks
asv machine --yes
asv run HEAD^! --quick --dry-run --show-stderr | sed "/failed$/ s/^/##[error]/" | tee benchmarks.log
if grep "failed" benchmarks.log > /dev/null ; then
exit 1
fi

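The failure gate at the end of this run step relies on sed to annotate failing lines and grep to decide the exit status. A Python sketch of that gate's logic (illustrative only, not part of the workflow) makes the behavior concrete: lines ending in `failed` get the `##[error]` prefix that GitHub Actions highlights in the log, while any occurrence of `failed` anywhere fails the job.

```python
def annotate_and_check(log_lines):
    """Mimic the workflow's sed/grep gate on asv output.

    sed "/failed$/ s/^/##[error]/" prefixes lines *ending* in "failed";
    grep "failed" fails the job if "failed" appears *anywhere* in the log.
    """
    annotated = [
        "##[error]" + line if line.endswith("failed") else line
        for line in log_lines
    ]
    any_failed = any("failed" in line for line in log_lines)
    return annotated, any_failed
```

For example, feeding it `["bench_a ok", "bench_b failed"]` returns the annotated log plus `True`, corresponding to the workflow's `exit 1`.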
15 changes: 9 additions & 6 deletions .github/workflows/publish.yml
@@ -1,11 +1,12 @@
name: Publish distributions to PyPI

# if this workflow is modified to be a generic CI workflow then
# add an if statement to the publish step so it only runs on tags.
on:
pull_request:
push:
branches:
- master
tags:
- "v*"
- "v*"

jobs:
build-n-publish:
@@ -26,13 +27,15 @@ jobs:
- name: Install build tools
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade setuptools wheel
python -m pip install build

- name: Build packages
run: python setup.py sdist bdist_wheel
run: python -m build

# only publish distribution to PyPI for tagged commits
- name: Publish distribution to PyPI
uses: pypa/gh-action-pypi-publish@master
if: startsWith(github.ref, 'refs/tags/v')
uses: pypa/gh-action-pypi-publish@release/v1
with:
user: __token__
password: ${{ secrets.pypi_password }}
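The key change in this file is that publishing is now gated per step rather than per workflow: the workflow runs on pull requests and pushes so the `python -m build` step is always exercised, while the `if:` guard restricts the PyPI upload to version tags. A Python mimic of that guard (the real check is evaluated by the Actions expression engine, not Python):

```python
def should_publish(github_ref: str) -> bool:
    # Mirrors `if: startsWith(github.ref, 'refs/tags/v')` on the publish step.
    return github_ref.startswith("refs/tags/v")

# Tag pushes publish; branch pushes and PRs only build.
print(should_publish("refs/tags/v0.9.3"))   # True
print(should_publish("refs/heads/master"))  # False
```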
111 changes: 111 additions & 0 deletions .github/workflows/pytest-remote-data.yml
@@ -0,0 +1,111 @@
# A secondary test job that only runs the iotools tests if explicitly requested
# (for pull requests) or on a push to the master branch.
# Because the iotools tests require GitHub secrets, we need to be careful about
# malicious PRs accessing the secrets and exposing them externally.
#
# We prevent this by only running this workflow when a maintainer has looked
# over the PR's diff and verified that nothing malicious seems to be going on.
# The maintainer then adds the "remote-data" label to the PR, which will then
# trigger this workflow via the combination of the "on: ... types:"
# and "if:" sections below. The first restricts the workflow to only run when
# a label is added to the PR and the second requires one of the PR's labels
# is the "remote-data" label. Technically this is slightly different from
# triggering when the "remote-data" label is added, since it will also trigger
# when "remote-data" is added, then later some other label is added. Maybe
# there's a better way to do this.
#
# But wait, you say! Can't a malicious PR get around this by modifying
# this workflow file and removing the label requirement? I think the answer
# is "no" as long as we trigger the workflow on "pull_request_target" instead
# of the usual "pull_request". The difference is what context the workflow
# runs inside: "pull_request" runs in the context of the fork, where changes
# to the workflow definition will take immediate effect, while "pull_request_target"
# runs in the context of the main pvlib repository, where the original (non-fork)
# workflow definition is used instead. Of course by switching away from the fork's
# context to keep our original workflow definitions, we're also keeping all the
# original code, so the tests won't be run against the PR's new code. To fix this
# we explicitly check out the PR's code as the first step of the workflow.
# This allows the job to run modified pvlib & pytest code, but only ever via
# the original workflow file.
# So long as a maintainer always verifies that the PR's code is not malicious prior to
# adding the label and triggering this workflow, I think this should not present
# a security risk.
#
# Note that this workflow can be triggered again by removing and re-adding the
# "remote-data" label to the PR.
#
# Note also that "pull_request_target" is also the only way for the secrets
# to be accessible in the first place.
#
# Further reading:
# - https://securitylab.github.com/research/github-actions-preventing-pwn-requests/
# - https://github.community/t/can-workflow-changes-be-used-with-pull-request-target/178626/7

name: pytest-remote-data

on:
pull_request_target:
types: [labeled]
push:
branches:
- master

jobs:
test:

strategy:
fail-fast: false # don't cancel other matrix jobs when one fails
matrix:
python-version: [3.7, 3.8, 3.9, "3.10"]
suffix: [''] # the alternative to "-min"
include:
- python-version: 3.7
suffix: -min

runs-on: ubuntu-latest
if: (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'remote-data')) || (github.event_name == 'push')

steps:
- uses: actions/checkout@v3
if: github.event_name == 'pull_request_target'
# pull_request_target runs in the context of the target branch (pvlib/master),
# but what we need is the hypothetical merge commit from the PR:
with:
ref: "refs/pull/${{ github.event.number }}/merge"

- uses: actions/checkout@v2
if: github.event_name == 'push'

- name: Set up conda environment
uses: conda-incubator/setup-miniconda@v2
with:
activate-environment: test_env
environment-file: ${{ env.REQUIREMENTS }}
python-version: ${{ matrix.python-version }}
auto-activate-base: false
env:
# build requirement filename. First replacement is for the python
# version, second is to add "-min" if needed
REQUIREMENTS: ci/requirements-py${{ matrix.python-version }}${{ matrix.suffix }}.yml

- name: List installed package versions
shell: bash -l {0} # necessary for conda env to be active
run: conda list

- name: Run tests
shell: bash -l {0} # necessary for conda env to be active
env:
# copy GitHub Secrets into environment variables for the tests to access
NREL_API_KEY: ${{ secrets.NRELAPIKEY }}
SOLARANYWHERE_API_KEY: ${{ secrets.SOLARANYWHERE_API_KEY }}
BSRN_FTP_USERNAME: ${{ secrets.BSRN_FTP_USERNAME }}
BSRN_FTP_PASSWORD: ${{ secrets.BSRN_FTP_PASSWORD }}
run: pytest pvlib/tests/iotools pvlib/tests/test_forecast.py --cov=./ --cov-report=xml --remote-data

- name: Upload coverage to Codecov
if: matrix.python-version == 3.7 && matrix.suffix == ''
uses: codecov/codecov-action@v2
with:
fail_ci_if_error: true
verbose: true
flags: remote-data # flags are configured in codecov.yml
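The long comment at the top of this file pins down the trigger logic: the job runs either on a push to master, or on a `pull_request_target` labeling event when the PR currently carries the `remote-data` label. A small Python sketch of the top-level `if:` expression (an illustration of the gate, not the Actions expression engine):

```python
def remote_data_job_runs(event_name, pr_labels):
    """Mimic the workflow gate:
    (pull_request_target AND 'remote-data' label present) OR push.

    Note the caveat from the comment above: adding *any* label fires the
    event; the gate only checks that 'remote-data' is among the labels.
    """
    labeled = event_name == "pull_request_target" and "remote-data" in pr_labels
    return labeled or event_name == "push"
```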
88 changes: 88 additions & 0 deletions .github/workflows/pytest.yml
@@ -0,0 +1,88 @@
name: pytest

on:
pull_request:
push:
branches:
- master

jobs:
test:
strategy:
fail-fast: false # don't cancel other matrix jobs when one fails
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: [3.7, 3.8, 3.9, "3.10"]
environment-type: [conda, bare]
suffix: [''] # placeholder as an alternative to "-min"
include:
- os: ubuntu-latest
python-version: 3.7
environment-type: conda
suffix: -min
exclude:
- os: macos-latest
environment-type: conda
- os: windows-latest
environment-type: bare

runs-on: ${{ matrix.os }}

steps:
# We check out only a limited depth and then pull tags to save time
- name: Checkout source
uses: actions/checkout@v3
with:
fetch-depth: 100

- name: Get tags
run: git fetch --depth=1 origin +refs/tags/*:refs/tags/*

- name: Install Conda environment with Micromamba
if: matrix.environment-type == 'conda'
uses: mamba-org/provision-with-micromamba@v12
with:
environment-file: ${{ env.REQUIREMENTS }}
cache-downloads: true
extra-specs: |
python=${{ matrix.python-version }}
env:
# build requirement filename. First replacement is for the python
# version, second is to add "-min" if needed
REQUIREMENTS: ci/requirements-py${{ matrix.python-version }}${{ matrix.suffix }}.yml

- name: List installed package versions (conda)
if: matrix.environment-type == 'conda'
shell: bash -l {0} # necessary for conda env to be active
run: micromamba list

- name: Install bare Python ${{ matrix.python-version }}${{ matrix.suffix }}
if: matrix.environment-type == 'bare'
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}

- name: Install pvlib
if: matrix.environment-type == 'conda'
shell: bash -l {0}
run: python -m pip install --no-deps .

- name: Set up bare environment
if: matrix.environment-type == 'bare'
run: |
pip install .[test]
pip freeze

- name: Run tests
shell: bash -l {0} # necessary for conda env to be active
run: |
# ignore iotools & forecast; those tests are run in a separate workflow
pytest pvlib --cov=./ --cov-report=xml --ignore=pvlib/tests/iotools --ignore=pvlib/tests/test_forecast.py

- name: Upload coverage to Codecov
if: matrix.python-version == 3.7 && matrix.suffix == '' && matrix.os == 'ubuntu-latest' && matrix.environment-type == 'conda'
uses: codecov/codecov-action@v2
with:
fail_ci_if_error: true
verbose: true
flags: core # flags are configured in codecov.yml
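The matrix above is easiest to read as a cross product with carve-outs: 3 OSes × 4 Python versions × 2 environment types, minus conda-on-macOS and bare-on-Windows, plus one extra `-min` job. A simplified Python model of the expansion (GitHub's real engine also merges `include` entries into matching jobs; here the `-min` entry matches no existing combination, so treating includes as standalone additions yields the same job count):

```python
from itertools import product

def expand_matrix(matrix, include=(), exclude=()):
    """Simplified GitHub Actions matrix expansion: cross product of the
    matrix keys, minus combinations matching an exclude entry, plus each
    include entry as an extra job."""
    keys = list(matrix)
    jobs = [dict(zip(keys, combo)) for combo in product(*matrix.values())]

    def matches(job, pattern):
        return all(job.get(k) == v for k, v in pattern.items())

    jobs = [job for job in jobs if not any(matches(job, e) for e in exclude)]
    jobs += [dict(entry) for entry in include]
    return jobs

jobs = expand_matrix(
    {"os": ["ubuntu-latest", "macos-latest", "windows-latest"],
     "python-version": ["3.7", "3.8", "3.9", "3.10"],
     "environment-type": ["conda", "bare"],
     "suffix": [""]},
    include=[{"os": "ubuntu-latest", "python-version": "3.7",
              "environment-type": "conda", "suffix": "-min"}],
    exclude=[{"os": "macos-latest", "environment-type": "conda"},
             {"os": "windows-latest", "environment-type": "bare"}],
)
print(len(jobs))  # 24 - 8 + 1 = 17 jobs
```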
11 changes: 0 additions & 11 deletions .landscape.yml

This file was deleted.

3 changes: 0 additions & 3 deletions .lgtm.yml
@@ -1,6 +1,3 @@
path_classifiers:
generated:
- pvlib/_version.py
library:
- versioneer.py
- pvlib/_deprecation.py
2 changes: 1 addition & 1 deletion .stickler.yml
@@ -5,4 +5,4 @@ linters:
ignore: E201,E241,E226,W503,W504
files:
ignore:
- 'pvlib/_version.py'
- 'pvlib/version.py'
26 changes: 11 additions & 15 deletions MANIFEST.in
@@ -1,20 +1,7 @@
include MANIFEST.in
include AUTHORS.md
include LICENSE
include README.md
include setup.py

# include most everything under pvlib by default
# better to package too much than not enough
graft pvlib

# we included pvlib files needed to compile NREL SPA code in graft above,
# now we exclude the NREL code itself to comply with their license
global-exclude */spa.c
global-exclude */spa.h
prune pvlib/spa_c_files/build

graft docs
prune docs/sphinx/build
prune docs/sphinx/source/generated
# all doc figures created by doc build
@@ -31,5 +18,14 @@ global-exclude .git*
global-exclude \#*
global-exclude .ipynb_checkpoints

include versioneer.py
include pvlib/_version.py
exclude .coveragerc
exclude .lgtm.yml
exclude .stickler.yml
exclude codecov.yml
exclude readthedocs.yml
exclude CODE_OF_CONDUCT.md

prune paper
prune .github
prune benchmarks
prune ci
18 changes: 5 additions & 13 deletions README.md
@@ -28,8 +28,11 @@
<a href="http://pvlib-python.readthedocs.org/en/stable/">
<img src="https://readthedocs.org/projects/pvlib-python/badge/?version=stable" alt="documentation build status" />
</a>
<a href="https://dev.azure.com/solararbiter/pvlib%20python/_build/latest?definitionId=4&branchName=master">
<img src="https://dev.azure.com/solararbiter/pvlib%20python/_apis/build/status/pvlib.pvlib-python?branchName=master" alt="Azure Pipelines build status" />
<a href="https://github.com/pvlib/pvlib-python/actions/workflows/pytest.yml?query=branch%3Amaster">
<img src="https://github.com/pvlib/pvlib-python/actions/workflows/pytest.yml/badge.svg?branch=master" alt="GitHub Actions Testing Status" />
</a>
<a href="https://codecov.io/gh/pvlib/pvlib-python">
<img src="https://codecov.io/gh/pvlib/pvlib-python/branch/master/graph/badge.svg" alt="codecov coverage" />
</a>
</td>
</tr>
@@ -44,17 +47,6 @@
</a>
</td>
</tr>
<tr>
<td>Coverage</td>
<td>
<a href="https://coveralls.io/r/pvlib/pvlib-python">
<img src="https://img.shields.io/coveralls/pvlib/pvlib-python.svg" alt="coveralls coverage" />
</a>
<a href="https://codecov.io/gh/pvlib/pvlib-python">
<img src="https://codecov.io/gh/pvlib/pvlib-python/branch/master/graph/badge.svg" alt="codecov coverage" />
</a>
</td>
</tr>
<tr>
<td>Benchmarks</td>
<td>