Commit bb081ed

mikofski authored and wholmgren committed
reorganize tests into subfolders and use pathlib for conftest DATA_DIR (#859)
* reorganize tests into subfolders; closes #848
* create pvlib/tests/iotools and move all iotools tests into that subfolder
* use fixtures in test_ecmwf_macc.py for expected_test_data
* use conftest.data_dir instead of the redundant boilerplate of retrieving the test folder in each test, mostly for iotools
* fix test_ivtools.py, which was using the full path to conftest, but tests is not a Python package, so that does not work
* change "pvlib/test" -> "pvlib/tests" to conform to popular convention
* in Azure Pipelines, pytest pvlib only, not pvlib/test
* Update v0.7.1.rst
* Update .codecov.yml
* Update docs/sphinx/source/whatsnew/v0.7.1.rst with suggestion from Will
* use pathlib; remove os and inspect
* some iotools expect a string to check for a URL using startswith(), so stringify path objects first
* in midc, use the requests.get params arg to build the querystring instead of doing it manually
* in a couple of places change constants to ALL CAPS, in some other places make a fixture, sometimes nothing; not consistent
* also in test_midc, comment out an unused URL; was it part of a test once?
* path objects only work seamlessly on Python 3.6 or later
* stringify a few more path objects
* psm3.read_psm3 still expects a string filename
* tmy3 and tmy2 also could be strings
* last two: stringify epw and surfrad
* fingers crossed!
* fix typo, mention pathlib in what's new

Signed-off-by: Mark Mikofski <[email protected]>
Co-authored-by: Will Holmgren <[email protected]>
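The conftest pattern the commit message describes (a single pathlib-based DATA_DIR instead of per-test `os.path`/`inspect` boilerplate) can be sketched as follows. Names here are illustrative, not necessarily pvlib's actual conftest:

```python
# Minimal sketch of a pathlib-based test-data helper in a conftest-style
# module. DATA_DIR is resolved once relative to this file; data_file() is
# a hypothetical helper that returns a plain string, since open() only
# accepts path objects on Python >= 3.6 (PEP 519).
from pathlib import Path

DATA_DIR = Path(__file__).parent / "data"


def data_file(name):
    """Return the path to a test data file as a string."""
    return str(DATA_DIR / name)
```

Each test module can then write `data_file('703165TY.csv')` instead of rebuilding the directory path with `os.path.dirname` and `inspect` in every file.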
1 parent 98537bc commit bb081ed

40 files changed: +100 −138 lines changed

.codecov.yml (1 addition, 1 deletion)

@@ -22,6 +22,6 @@ coverage:
   tests:
     target: 95%
     paths:
-      - "pvlib/test/.*"
+      - "pvlib/tests/.*"

 comment: off

azure-pipelines.yml (1 addition, 1 deletion)

@@ -155,7 +155,7 @@ jobs:
   - script: |
       pip install pytest pytest-cov pytest-mock pytest-timeout pytest-azurepipelines
       pip install -e .
-      pytest pvlib/test --junitxml=junit/test-results.xml --cov=pvlib --cov-report=xml --cov-report=html
+      pytest pvlib --junitxml=junit/test-results.xml --cov=pvlib --cov-report=xml --cov-report=html
     displayName: 'Test with pytest'

 - task: PublishTestResults@2

docs/sphinx/source/whatsnew/v0.7.1.rst (3 additions, 0 deletions)

@@ -26,6 +26,9 @@ Testing
 ~~~~~~~
 * Added single-year PSM3 API test for `iotools.get_psm3`.
 * Added tests for `iotools.parse_psm3` and `iotools.read_psm3`.
+* Change `pvlib/test` folder to `pvlib/tests` and reorganize tests into
+  subfolders, *e.g.*: created `pvlib/tests/iotools` (:pull:`859`)
+* replace `os.path` with `pathlib` and stringify path objects for Python<=3.5

 Documentation
 ~~~~~~~~~~~~~

pvlib/iotools/crn.py (1 addition, 1 deletion)

@@ -46,7 +46,7 @@ def read_crn(filename):

     Parameters
     ----------
-    filename: str
+    filename: str, path object, or file-like
         filepath or url to read for the fixed-width file.

     Returns

pvlib/iotools/epw.py (2 additions, 2 deletions)

@@ -217,7 +217,7 @@ def read_epw(filename, coerce_year=None):
        <https://energyplus.net/documentation>`_
     '''

-    if filename.startswith('http'):
+    if str(filename).startswith('http'):
         # Attempts to download online EPW file
         # See comments above for possible online sources
         request = Request(filename, headers={'User-Agent': (
@@ -228,7 +228,7 @@
         csvdata = io.StringIO(response.read().decode(errors='ignore'))
     else:
         # Assume it's accessible via the file system
-        csvdata = open(filename, 'r')
+        csvdata = open(str(filename), 'r')
     try:
         data, meta = parse_epw(csvdata, coerce_year)
     finally:

pvlib/iotools/midc.py (3 additions, 3 deletions)

@@ -249,12 +249,12 @@ def read_midc_raw_data_from_nrel(site, start, end, variable_map={},
     args = {'site': site,
             'begin': start.strftime('%Y%m%d'),
             'end': end.strftime('%Y%m%d')}
-    endpoint = 'https://midcdmz.nrel.gov/apps/data_api.pl?'
-    url = endpoint + '&'.join(['{}={}'.format(k, v) for k, v in args.items()])
+    url = 'https://midcdmz.nrel.gov/apps/data_api.pl'
+    # NOTE: just use requests.get(url, params=args) to build querystring
     # number of header columns and data columns do not always match,
     # so first parse the header to determine the number of data columns
     # to parse
-    csv_request = requests.get(url, timeout=timeout)
+    csv_request = requests.get(url, timeout=timeout, params=args)
     csv_request.raise_for_status()
     raw_csv = io.StringIO(csv_request.text)
     first_row = pd.read_csv(raw_csv, nrows=0)
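The midc change above replaces manual `'&'.join(...)` query building with the `params` keyword of `requests.get`, which percent-encodes each value and assembles the query string itself. A `PreparedRequest` shows the resulting URL without any network traffic (the argument values below are illustrative):

```python
import requests

# Same shape of args dict as in read_midc_raw_data_from_nrel, with
# made-up example values.
args = {'site': 'UAT', 'begin': '20200101', 'end': '20200107'}

prepared = requests.Request(
    'GET', 'https://midcdmz.nrel.gov/apps/data_api.pl', params=args
).prepare()

# requests builds and encodes the query string from the dict
print(prepared.url)
```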

pvlib/iotools/psm3.py (1 addition, 1 deletion)

@@ -295,6 +295,6 @@ def read_psm3(filename):
     .. [2] `Standard Time Series Data File Format
        <https://rredc.nrel.gov/solar/old_data/nsrdb/2005-2012/wfcsv.pdf>`_
     """
-    with open(filename, 'r') as fbuf:
+    with open(str(filename), 'r') as fbuf:
         content = parse_psm3(fbuf)
     return content

pvlib/iotools/solrad.py (1 addition, 1 deletion)

@@ -81,7 +81,7 @@ def read_solrad(filename):
        Program. Bull. Amer. Meteor. Soc., 77, 2857-2864.
        :doi:`10.1175/1520-0477(1996)077<2857:TNISIS>2.0.CO;2`
     """
-    if 'msn' in filename:
+    if 'msn' in str(filename):
         names = MADISON_HEADERS
         widths = MADISON_WIDTHS
         dtypes = MADISON_DTYPES

pvlib/iotools/surfrad.py (2 additions, 2 deletions)

@@ -122,12 +122,12 @@ def read_surfrad(filename, map_variables=True):
     .. [2] NOAA SURFRAD Data Archive
        `SURFRAD Archive <ftp://aftp.cmdl.noaa.gov/data/radiation/surfrad/>`_
     """
-    if filename.startswith('ftp'):
+    if str(filename).startswith('ftp'):
         req = Request(filename)
         response = urlopen(req)
         file_buffer = io.StringIO(response.read().decode(errors='ignore'))
     else:
-        file_buffer = open(filename, 'r')
+        file_buffer = open(str(filename), 'r')

     # Read and parse the first two lines to build the metadata dict.
     station = file_buffer.readline()

pvlib/iotools/tmy.py (3 additions, 3 deletions)

@@ -160,7 +160,7 @@ def read_tmy3(filename=None, coerce_year=None, recolumn=True):

     head = ['USAF', 'Name', 'State', 'TZ', 'latitude', 'longitude', 'altitude']

-    if filename.startswith('http'):
+    if str(filename).startswith('http'):
         request = Request(filename, headers={'User-Agent': (
             'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_5) '
             'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.87 '
@@ -169,7 +169,7 @@ def read_tmy3(filename=None, coerce_year=None, recolumn=True):
         csvdata = io.StringIO(response.read().decode(errors='ignore'))
     else:
         # assume it's accessible via the file system
-        csvdata = open(filename, 'r')
+        csvdata = open(str(filename), 'r')

     # read in file metadata, advance buffer to second line
     firstline = csvdata.readline()
@@ -409,7 +409,7 @@ def read_tmy2(filename):
     columns = 'year,month,day,hour,ETR,ETRN,GHI,GHISource,GHIUncertainty,DNI,DNISource,DNIUncertainty,DHI,DHISource,DHIUncertainty,GHillum,GHillumSource,GHillumUncertainty,DNillum,DNillumSource,DNillumUncertainty,DHillum,DHillumSource,DHillumUncertainty,Zenithlum,ZenithlumSource,ZenithlumUncertainty,TotCld,TotCldSource,TotCldUncertainty,OpqCld,OpqCldSource,OpqCldUncertainty,DryBulb,DryBulbSource,DryBulbUncertainty,DewPoint,DewPointSource,DewPointUncertainty,RHum,RHumSource,RHumUncertainty,Pressure,PressureSource,PressureUncertainty,Wdir,WdirSource,WdirUncertainty,Wspd,WspdSource,WspdUncertainty,Hvis,HvisSource,HvisUncertainty,CeilHgt,CeilHgtSource,CeilHgtUncertainty,PresentWeather,Pwat,PwatSource,PwatUncertainty,AOD,AODSource,AODUncertainty,SnowDepth,SnowDepthSource,SnowDepthUncertainty,LastSnowfall,LastSnowfallSource,LastSnowfallUncertaint'  # noqa: E501
     hdr_columns = 'WBAN,City,State,TZ,latitude,longitude,altitude'

-    tmy2, tmy2_meta = _read_tmy2(string, columns, hdr_columns, filename)
+    tmy2, tmy2_meta = _read_tmy2(string, columns, hdr_columns, str(filename))

     return tmy2, tmy2_meta
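The reason every reader in this commit stringifies its argument is the commit message's note that "path objects only work seamlessly on Python 3.6 or later": PEP 519 (Python 3.6) taught `open()` and the `os` functions to accept any `os.PathLike`, but Python 3.5 still requires plain strings. A small sketch of the equivalence (the example path is made up):

```python
import os
from pathlib import Path

p = Path('pvlib') / 'tests' / 'data' / 'demo.csv'

# On Python >= 3.6, open(p) works directly via the __fspath__ protocol;
# on 3.5 it raises TypeError, hence the str(filename) calls in the diffs.
# For Path objects, str() and the 3.6+ canonical os.fspath() agree:
assert str(p) == os.fspath(p)
```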
