Merge pull request #830 from dopplershift/cleanup
Miscellaneous Cleanups
dcamron authored Nov 15, 2024
2 parents 489a81a + 74b01ef commit f0bc4a8
Showing 23 changed files with 239 additions and 172 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -52,3 +52,4 @@ nosetests.xml

# Translations
*.mo
docs/sg_execution_times.rst
6 changes: 3 additions & 3 deletions CONTRIBUTING.md
@@ -83,7 +83,7 @@ Add tests for your change(s). Make the tests pass:

py.test

Commit the changes you made. Chris Beams has written a [guide](https://chris.beams.io/posts/git-commit/) on how to write good commit messages.
Commit the changes you made. Chris Beams has written a [guide](https://cbea.ms/git-commit/) on how to write good commit messages.

Push to your fork and [submit a pull request][pr].

@@ -111,8 +111,8 @@ Some things that will increase the chance that your pull request is accepted:
* Follow [PEP8][pep8] for style. (The `flake8` utility can help with this.)
* Write a [good commit message][commit].

Pull requests will automatically have tests run by Travis. This includes
running both the unit tests as well as the `flake8` code linter.
Pull requests will automatically have tests run by GitHub Actions. This includes
running both the unit tests as well as the `flake8` and `ruff` code linters.

[pep8]: https://pep8.org
[commit]: https://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
2 changes: 1 addition & 1 deletion README.rst
@@ -5,7 +5,7 @@ Siphon

|Docs| |PyPI| |Conda|

|Travis| |AppVeyor| |CodeCov|
|CodeCov|

|Codacy|

16 changes: 0 additions & 16 deletions TODO.md

This file was deleted.

14 changes: 14 additions & 0 deletions docs/conf.py
@@ -58,8 +58,15 @@
'numpy': ('https://numpy.org/doc/stable/', None),
'matplotlib': ('https://matplotlib.org/stable/', None),
'requests': ('https://requests.kennethreitz.org/en/latest/', None),
'pandas': ('https://pandas.pydata.org/docs/', None),
}

nitpicky = True
nitpick_ignore = [
('py:class', 'optional'), ('py:class', 'file-like object'), ('py:class', 'iterator')
]
nitpick_ignore_regex = [('py:class', r'.*[cC]allable'),]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

@@ -304,3 +311,10 @@

# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False

# Dictionary of URL redirects allowed
linkcheck_allowed_redirects = {
r'https://doi.org/.*': r'https://.*',
r'https://gitter.im/Unidata/siphon': r'https://app.gitter.im/.*siphon.*',
r'https://codecov.io/github/Unidata/siphon': r'https://app.codecov.io/github/Unidata/siphon',
}
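
For context on the additions above: ``nitpicky = True`` makes Sphinx warn about every cross-reference it cannot resolve, with ``nitpick_ignore`` and ``nitpick_ignore_regex`` listing targets that are known not to resolve, while ``linkcheck_allowed_redirects`` maps a source-URL regex to the redirect target the ``linkcheck`` builder should accept. A minimal sketch of the redirect mapping, with one hypothetical extra entry for illustration:

    # docs/conf.py (sketch) -- the example.org entry is hypothetical
    linkcheck_allowed_redirects = {
        # a DOI link is acceptable as long as it redirects to some HTTPS page
        r'https://doi.org/.*': r'https://.*',
        r'https://example.org/latest': r'https://example.org/v\d+/.*',
    }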
23 changes: 15 additions & 8 deletions docs/developerguide.rst
@@ -83,7 +83,7 @@ Code Style
----------

Siphon uses the Python code style outlined in `PEP8
<https://www.python.org/dev/peps/pep-0008/>`_. For better or worse, this is what the majority
<https://peps.python.org/pep-0008/>`_. For better or worse, this is what the majority
of the Python world uses. The one deviation is that the line length limit is 95 characters. 80 is a
good target, but sometimes longer lines are needed.

@@ -107,7 +107,7 @@ generated from docstrings, written using the
There are also examples in the ``examples/`` directory.

The documentation is hosted on `GitHub Pages <https://unidata.github.io/siphon>`_. The docs are
built automatically from ``main`` with every build on Travis-CI; every merged PR will
built automatically from ``main`` with every build on GitHub Actions; every merged PR will
have the built docs uploaded to GitHub Pages. As part of the build, the documentation is also
checked with ``doc8``. To see what the docs will look like, you also need to install the
``sphinx-rtd-theme`` package.
@@ -116,12 +116,19 @@ checked with ``doc8``. To see what the docs will look like, you also need to ins
Other Tools
-----------

Continuous integration is performed by `Travis CI <https://www.travis-ci.org/Unidata/siphon>`_.
This service runs the unit tests on all support versions, as well as runs against the minimum
package versions. ``flake8`` is also run against the code to check formatting. Travis is also
used to build the documentation and to run the examples to ensure they stay working.

Test coverage is monitored by `Codecov.io <https://codecov.io/github/Unidata/siphon>`_.
Continuous integration is performed by
`GitHub Actions <https://github.com/Unidata/siphon/actions>`_.
This integration runs the unit tests on Linux for all supported versions of Python, as well
as against the minimum package versions, using PyPI packages. This also runs against
a (non-exhaustive) matrix of Python versions on macOS and Windows. In addition to these tests,
GitHub Actions also builds the documentation and runs the examples across multiple platforms
and Python versions, as well as checks for any broken web links. ``flake8`` (along with a
variety of plugins found in ``ci/linting.txt``) and ``ruff`` are also run against the code to
check formatting using another job on GitHub Actions. As part of this linting job, the docs
are also checked using the ``doc8`` tool, and spelling is checked using ``codespell``.
Configurations for these are in a variety of files in ``.github/workflows``.

Test coverage is monitored by `codecov.io <https://codecov.io/github/Unidata/siphon>`_.

---------
Releasing
4 changes: 2 additions & 2 deletions examples/Radar_Server_Level_3.py
@@ -8,7 +8,7 @@
Use Siphon to get NEXRAD Level 3 data from a TDS.
"""
from datetime import datetime
from datetime import datetime, timezone

import matplotlib.pyplot as plt
import numpy as np
@@ -38,7 +38,7 @@
# N0B, which is reflectivity data for the lowest tilt. We see that when the query
# is represented as a string, it shows the encoded URL.
query = rs.query()
query.stations('CYS').time(datetime.utcnow()).variables('N0B')
query.stations('CYS').time(datetime.now(timezone.utc)).variables('N0B')

###########################################
# We can use the RadarServer instance to check our query, to make
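
Aside on the change above: ``datetime.utcnow()`` returns a naive datetime (no ``tzinfo``) and is deprecated as of Python 3.12, while ``datetime.now(timezone.utc)`` gives the same instant as a timezone-aware value, which is why every example in this PR swaps one for the other. A minimal illustration:

    from datetime import datetime, timezone

    naive = datetime.utcnow()            # no tzinfo attached; deprecated in Python 3.12+
    aware = datetime.now(timezone.utc)   # same instant, carries tzinfo=timezone.utc
    print(naive.tzinfo, aware.tzinfo)    # None UTC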
6 changes: 3 additions & 3 deletions examples/ncss/NCSS_Cartopy_Example.py
@@ -11,7 +11,7 @@
This example uses Siphon's NCSS class to provide temperature data
for contouring a basic map using CartoPy.
"""
from datetime import datetime
from datetime import datetime, timezone

import cartopy.crs as ccrs
import cartopy.feature as cfeature
@@ -49,13 +49,13 @@
# will return all surface temperatures for points in our bounding box for a single time,
# nearest to that requested. Note the string representation of the query is a properly encoded
# query string.
query.lonlat_box(north=43, south=35, east=-100, west=-111).time(datetime.utcnow())
query.lonlat_box(north=43, south=35, east=-100, west=-111).time(datetime.now(timezone.utc))
query.accept('netcdf4')
query.variables('Temperature_surface')

###########################################
# We now request data from the server using this query. The `NCSS` class handles parsing
# this NetCDF data (using the `netCDF4` module). If we print out the variable names, we see
# this NetCDF data (using the ``netCDF4`` module). If we print out the variable names, we see
# our requested variable, as well as the coordinate variables (needed to properly reference
# the data).
data = ncss.get_data(query)
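
For readers following along, a condensed sketch of the NCSS round trip this example walks through; the endpoint URL here is a hypothetical stand-in for whatever dataset the example resolves from the catalog:

    from datetime import datetime, timezone
    from siphon.ncss import NCSS

    # hypothetical NCSS endpoint URL, for illustration only
    ncss = NCSS('https://thredds.ucar.edu/thredds/ncss/grid/some/dataset')
    query = ncss.query()
    query.lonlat_box(north=43, south=35, east=-100, west=-111).time(datetime.now(timezone.utc))
    query.accept('netcdf4')
    query.variables('Temperature_surface')

    data = ncss.get_data(query)    # parsed with the netCDF4 module
    print(list(data.variables))    # the requested variable plus its coordinate variables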
6 changes: 3 additions & 3 deletions examples/ncss/NCSS_Example.py
@@ -8,7 +8,7 @@
Use Siphon to query the NetCDF Subset Service (NCSS).
"""
from datetime import datetime
from datetime import datetime, timezone

import matplotlib.pyplot as plt

@@ -38,13 +38,13 @@
# 'Temperature_isobaric' and 'Relative_humidity_isobaric'. This request will return all
# vertical levels for a single point and single time. Note the string representation of
# the query is a properly encoded query string.
query.lonlat_point(-105, 40).time(datetime.utcnow())
query.lonlat_point(-105, 40).time(datetime.now(timezone.utc))
query.accept('netcdf4')
query.variables('Temperature_isobaric', 'Relative_humidity_isobaric')

###########################################
# We now request data from the server using this query. The `NCSS` class handles parsing
# this NetCDF data (using the `netCDF4` module). If we print out the variable names,
# this NetCDF data (using the ``netCDF4`` module). If we print out the variable names,
# we see our requested variables, as well as a few others (more metadata information)
data = ncss.get_data(query)
list(data.variables)
8 changes: 4 additions & 4 deletions examples/ncss/NCSS_Timeseries_Examples.py
@@ -8,7 +8,7 @@
Use Siphon to query the NetCDF Subset Service for a timeseries.
"""
from datetime import datetime, timedelta
from datetime import datetime, timedelta, timezone

import matplotlib.pyplot as plt
from netCDF4 import num2date
@@ -39,13 +39,13 @@
# 'Temperature_isobaric', at the vertical level of 100000 Pa (approximately surface).
# This request will return all times in the range for a single point. Note the string
# representation of the query is a properly encoded query string.
now = datetime.utcnow()
now = datetime.now(timezone.utc)
query.lonlat_point(-105, 40).vertical_level(100000).time_range(now, now + timedelta(days=7))
query.variables('Temperature_isobaric').accept('netcdf')

###########################################
# We now request data from the server using this query. The `NCSS` class handles parsing
# this NetCDF data (using the `netCDF4` module). If we print out the variable names, we
# this NetCDF data (using the ``netCDF4`` module). If we print out the variable names, we
# see our requested variables, as well as a few others (more metadata information)
data = ncss.get_data(query)
list(data.variables)
@@ -57,7 +57,7 @@

###########################################
# The time values are in hours relative to the start of the entire model collection.
# Fortunately, the `netCDF4` module has a helper function to convert these numbers into
# Fortunately, the ``netCDF4`` module has a helper function to convert these numbers into
# Python `datetime` objects. We can see the first 5 elements output by the function look
# reasonable.
time_vals = num2date(time[:].squeeze(), time.units, only_use_cftime_datetimes=False)
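
The ``num2date`` helper mentioned above turns "hours since ..." offsets into ``datetime`` objects; a standalone sketch with made-up values and units string:

    from netCDF4 import num2date

    values = [0, 6, 12, 18, 24]                # hypothetical offsets from the time variable
    units = 'hours since 2024-11-15 00:00:00'  # hypothetical units attribute
    times = num2date(values, units, only_use_cftime_datetimes=False)
    print(times[:5])                           # plain datetime.datetime objects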
7 changes: 6 additions & 1 deletion pyproject.toml
@@ -52,7 +52,7 @@ test = [
"siphon[extras]"
]
extras = [
"xarray>=2022.3.0"
"xarray>=2022.6.0"
]

[project.urls]
@@ -87,6 +87,11 @@ combine_star = true
[tool.pytest.ini_options]
norecursedirs = "build docs .idea"
doctest_optionflags = "NORMALIZE_WHITESPACE"
xfail_strict = true
filterwarnings = [
"error",
"ignore:numpy.ndarray size changed:RuntimeWarning",
]

[tool.ruff]
line-length = 95
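
The new pytest options above make warnings fail tests: ``filterwarnings = ["error", ...]`` turns any warning not explicitly ignored into an exception (the numpy ABI-size ``RuntimeWarning`` is the one carve-out), and ``xfail_strict = true`` makes an unexpectedly passing ``xfail`` test count as a failure. A sketch of what the error filter means in practice, with a hypothetical test:

    import warnings

    def test_no_stray_warnings():
        # Under filterwarnings = ["error", ...] this warning is raised as an
        # exception, so the test fails instead of passing silently.
        warnings.warn('something is deprecated', DeprecationWarning)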
7 changes: 5 additions & 2 deletions src/siphon/catalog.py
@@ -101,6 +101,7 @@ def filter_time_nearest(self, time, regex=None, strptime=None):
Returns
-------
Dataset
The value with a time closest to that desired
"""
@@ -137,6 +138,7 @@ def filter_time_range(self, start, end, regex=None, strptime=None):
Returns
-------
List[Dataset]
All values corresponding to times within the specified range
"""
@@ -611,7 +613,7 @@ def remote_open(self, mode='b', encoding='ascii', errors='ignore'):
Parameters
----------
mode : 'b' or 't', optional
mode : `'b'` or `'t'`, optional
Mode with which to open the remote data; 'b' for binary, 't' for text. Defaults
to 'b'.
@@ -625,7 +627,8 @@ def remote_open(self, mode='b', encoding='ascii', errors='ignore'):
Returns
-------
A random access, file-like object
fobj : file-like object
A random access, file-like object for reading data
"""
fobj = self.access_with_service('HTTPServer')
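
A hedged usage sketch of the interfaces whose docstrings are touched here; the catalog URL and the times are hypothetical, while ``filter_time_nearest``, ``filter_time_range``, and ``remote_open`` are the documented siphon calls:

    from datetime import datetime, timedelta
    from siphon.catalog import TDSCatalog

    # hypothetical catalog URL for illustration
    cat = TDSCatalog('https://thredds.ucar.edu/thredds/catalog/some/dataset/catalog.xml')

    target = datetime(2024, 11, 15, 12)                  # hypothetical time of interest
    nearest = cat.datasets.filter_time_nearest(target)   # single Dataset closest in time
    recent = cat.datasets.filter_time_range(target - timedelta(hours=6), target)  # list of Datasets

    with nearest.remote_open(mode='b') as fobj:          # random access, file-like object
        header = fobj.read(16)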
14 changes: 7 additions & 7 deletions src/siphon/http_util.py
@@ -96,7 +96,7 @@ def urlopen(self, url, decompress=False, **kwargs):
url : str
The URL to request
kwargs : arbitrary keyword arguments
kwargs
Additional keyword arguments to pass to :meth:`requests.Session.get`.
Returns
@@ -140,14 +140,14 @@ class DataQuery:
This object provides a clear API to formulate a query for data, including
a spatial query, a time query, and possibly some variables or other parameters.
These objects provide a dictionary-like interface, (:meth:`items` and :meth:`__iter__`)
These objects provide a dictionary-like interface, (``items`` and ``__iter__``)
sufficient to be passed to functions expecting a dictionary representing a URL query.
Instances of this object can also be turned into a string, which will yield a
properly escaped string for a URL.
"""

def __init__(self):
"""Construct an empty :class:`DataQuery`."""
"""Construct an empty class representing a query for data."""
self.var = set()
self.time_query = OrderedDict()
self.spatial_query = OrderedDict()
@@ -163,7 +163,7 @@ def variables(self, *var_names):
Parameters
----------
var_names : one or more strings
var_names : str
One or more names of variables to request. Use 'all' to request all.
Returns
@@ -183,7 +183,7 @@ def add_query_parameter(self, **kwargs):
Parameters
----------
kwargs : one or more strings passed as keyword arguments
kwargs
Names and values of parameters to add to the query
Returns
@@ -471,7 +471,7 @@ def get(self, path, params=None):
Raises
------
HTTPError
`~requests.HTTPError`
If the server returns anything other than a 200 (OK) code
See Also
@@ -506,7 +506,7 @@ def validate_query(self, query):
Parameters
----------
query : DataQuery (or subclass)
query : DataQuery
Returns
-------
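
A short sketch of the dictionary-like behavior the ``DataQuery`` docstring describes; the parameter values are arbitrary:

    from siphon.http_util import DataQuery

    query = DataQuery()
    query.variables('Temperature_isobaric', 'Relative_humidity_isobaric')
    query.lonlat_point(-105, 40)
    query.add_query_parameter(accept='netcdf4')

    print(dict(query.items()))   # usable anywhere a dict of query parameters is expected
    print(str(query))            # the same parameters as a properly escaped query string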
2 changes: 1 addition & 1 deletion src/siphon/ncss.py
@@ -49,7 +49,7 @@ class NCSS(HTTPEndPoint):
"""

# Need staticmethod to keep this from becoming a bound method, where self
# is passed implicitly
# is passed implicitly. Needed to avoid warning about duplicated docstring.
unit_handler = staticmethod(lambda *a, **kw: default_unit_handler(*a, **kw))

def _get_metadata(self):
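
The comment tweak above concerns plain callables stored as class attributes turning into bound methods; a generic illustration, with names that are hypothetical and unrelated to siphon internals:

    class Example:
        # Looked up through an instance, a bare function becomes a bound method,
        # so the instance is silently passed as the first argument.
        raw_handler = lambda value: value * 2                  # noqa: E731
        safe_handler = staticmethod(lambda value: value * 2)   # noqa: E731

    obj = Example()
    print(obj.safe_handler(3))   # 6
    # obj.raw_handler(3) raises TypeError: <lambda>() takes 1 positional argument but 2 were given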
4 changes: 2 additions & 2 deletions src/siphon/radarserver.py
@@ -30,7 +30,7 @@ def stations(self, *stns):
Parameters
----------
stns : one or more strings
stns : str
One or more names of variables to request
Returns
@@ -192,7 +192,7 @@ def get_radarserver_datasets(server):
Parameters
----------
server : string
server : str
The base URL to the THREDDS server
Returns
5 changes: 3 additions & 2 deletions src/siphon/simplewebservice/acis.py
@@ -37,11 +37,12 @@ def acis_request(method, params):
Returns
-------
A dictionary of data based on the JSON parameters
dict[str, Any]
A dictionary of data based on the JSON parameters
Raises
------
:class: `ACIS_API_Exception`
`AcisApiException`
When the API is unable to establish a connection or returns
unparsable data.
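
A hedged sketch of calling ``acis_request``; the 'StnMeta' method name and its parameters follow the public ACIS web-service documentation and are assumptions here, not taken from this diff:

    from siphon.simplewebservice.acis import acis_request

    # 'StnMeta' and the parameter names are assumptions based on the ACIS web service
    params = {'sids': 'KDEN', 'meta': 'name,state,ll'}
    metadata = acis_request('StnMeta', params)   # returns a dict parsed from the JSON response
    print(metadata)
    # Raises AcisApiException if the connection fails or the response cannot be parsed.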