
Commit 6c1995e

Merge branch 'main' into update-cftime-frequency-strings

dcherian authored Nov 15, 2023
2 parents: d154ba9 + 1411474

Showing 49 changed files with 700 additions and 999 deletions.
8 changes: 3 additions & 5 deletions .binder/environment.yml

@@ -2,11 +2,10 @@ name: xarray-examples
 channels:
   - conda-forge
 dependencies:
-  - python=3.9
+  - python=3.10
   - boto3
   - bottleneck
   - cartopy
-  - cdms2
   - cfgrib
   - cftime
   - coveralls
@@ -25,7 +24,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint
+  - pint>=0.22
   - pip
   - pooch
   - pydap
@@ -38,5 +37,4 @@ dependencies:
   - toolz
   - xarray
   - zarr
-  - pip:
-    - numbagg
+  - numbagg
8 changes: 1 addition & 7 deletions .github/workflows/ci.yaml

@@ -67,13 +67,7 @@ jobs:
       run: |
         echo "TODAY=$(date +'%Y-%m-%d')" >> $GITHUB_ENV
-        if [[ "${{matrix.python-version}}" == "3.11" ]]; then
-          if [[ ${{matrix.os}} == windows* ]]; then
-            echo "CONDA_ENV_FILE=ci/requirements/environment-windows-py311.yml" >> $GITHUB_ENV
-          else
-            echo "CONDA_ENV_FILE=ci/requirements/environment-py311.yml" >> $GITHUB_ENV
-          fi
-        elif [[ ${{ matrix.os }} == windows* ]] ;
+        if [[ ${{ matrix.os }} == windows* ]] ;
         then
           echo "CONDA_ENV_FILE=ci/requirements/environment-windows.yml" >> $GITHUB_ENV
         elif [[ "${{ matrix.env }}" != "" ]] ;
12 changes: 6 additions & 6 deletions .pre-commit-config.yaml

@@ -3,7 +3,7 @@ ci:
   autoupdate_schedule: monthly
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.4.0
+    rev: v4.5.0
     hooks:
       - id: trailing-whitespace
       - id: end-of-file-fixer
@@ -18,24 +18,24 @@ repos:
         files: ^xarray/
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: 'v0.0.292'
+    rev: 'v0.1.4'
     hooks:
       - id: ruff
        args: ["--fix"]
   # https://github.com/python/black#version-control-integration
   - repo: https://github.com/psf/black
-    rev: 23.9.1
+    rev: 23.10.1
     hooks:
       - id: black-jupyter
   - repo: https://github.com/keewis/blackdoc
-    rev: v0.3.8
+    rev: v0.3.9
     hooks:
       - id: blackdoc
         exclude: "generate_aggregations.py"
-        additional_dependencies: ["black==23.9.1"]
+        additional_dependencies: ["black==23.10.1"]
       - id: blackdoc-autoupdate-black
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.5.1
+    rev: v1.6.1
     hooks:
       - id: mypy
         # Copied from setup.cfg
5 changes: 1 addition & 4 deletions ci/requirements/all-but-dask.yml

@@ -3,13 +3,11 @@ channels:
   - conda-forge
   - nodefaults
 dependencies:
-  - python=3.10
   - black
   - aiobotocore
   - boto3
   - bottleneck
   - cartopy
-  - cdms2
   - cftime
   - coveralls
   - flox
@@ -26,9 +24,8 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
-  - pseudonetcdf
   - pydap
   - pytest
   - pytest-cov
48 changes: 0 additions & 48 deletions ci/requirements/environment-py311.yml

This file was deleted.

44 changes: 0 additions & 44 deletions ci/requirements/environment-windows-py311.yml

This file was deleted.

4 changes: 1 addition & 3 deletions ci/requirements/environment-windows.yml

@@ -5,7 +5,6 @@ dependencies:
   - boto3
   - bottleneck
   - cartopy
-  # - cdms2 # Not available on Windows
   - cftime
   - dask-core
   - distributed
@@ -25,10 +24,9 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pre-commit
-  - pseudonetcdf
   - pydap
   - pytest
   - pytest-cov
4 changes: 1 addition & 3 deletions ci/requirements/environment.yml

@@ -7,7 +7,6 @@ dependencies:
   - boto3
   - bottleneck
   - cartopy
-  - cdms2
   - cftime
   - dask-core
   - distributed
@@ -29,11 +28,10 @@ dependencies:
   - opt_einsum
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pooch
   - pre-commit
-  - pseudonetcdf
   - pydap
   - pytest
   - pytest-cov
4 changes: 1 addition & 3 deletions ci/requirements/min-all-deps.yml

@@ -11,7 +11,6 @@ dependencies:
   - boto3=1.24
   - bottleneck=1.3
   - cartopy=0.20
-  - cdms2=3.1
   - cftime=1.6
   - coveralls
   - dask-core=2022.7
@@ -35,9 +34,8 @@ dependencies:
   - numpy=1.22
   - packaging=21.3
   - pandas=1.4
-  - pint=0.19
+  - pint=0.22
   - pip
-  - pseudonetcdf=3.2
   - pydap=3.3
   - pytest
   - pytest-cov
14 changes: 0 additions & 14 deletions doc/api-hidden.rst

@@ -591,20 +591,6 @@
     backends.H5netcdfBackendEntrypoint.guess_can_open
     backends.H5netcdfBackendEntrypoint.open_dataset

-    backends.PseudoNetCDFDataStore.close
-    backends.PseudoNetCDFDataStore.get_attrs
-    backends.PseudoNetCDFDataStore.get_dimensions
-    backends.PseudoNetCDFDataStore.get_encoding
-    backends.PseudoNetCDFDataStore.get_variables
-    backends.PseudoNetCDFDataStore.open
-    backends.PseudoNetCDFDataStore.open_store_variable
-    backends.PseudoNetCDFDataStore.ds
-
-    backends.PseudoNetCDFBackendEntrypoint.description
-    backends.PseudoNetCDFBackendEntrypoint.url
-    backends.PseudoNetCDFBackendEntrypoint.guess_can_open
-    backends.PseudoNetCDFBackendEntrypoint.open_dataset
-
     backends.PydapDataStore.close
     backends.PydapDataStore.get_attrs
     backends.PydapDataStore.get_dimensions
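With the PseudoNetCDF store and entrypoint gone from the API listing, the matching ``engine="pseudonetcdf"`` backend goes away too. A minimal sketch of how to check which backends a given install still registers (output varies by environment):

    import xarray as xr

    # list_engines reports the backend entrypoints xarray can currently find;
    # after this removal, "pseudonetcdf" should no longer be among them.
    print(sorted(xr.backends.list_engines()))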
7 changes: 2 additions & 5 deletions doc/api.rst

@@ -557,6 +557,7 @@ Datetimelike properties
     DataArray.dt.seconds
     DataArray.dt.microseconds
     DataArray.dt.nanoseconds
+    DataArray.dt.total_seconds

 **Timedelta methods**:

@@ -602,7 +603,7 @@ Dataset methods
     Dataset.as_numpy
     Dataset.from_dataframe
     Dataset.from_dict
-    Dataset.to_array
+    Dataset.to_dataarray
     Dataset.to_dataframe
     Dataset.to_dask_dataframe
     Dataset.to_dict
@@ -627,11 +628,9 @@ DataArray methods
     load_dataarray
     open_dataarray
     DataArray.as_numpy
-    DataArray.from_cdms2
     DataArray.from_dict
     DataArray.from_iris
     DataArray.from_series
-    DataArray.to_cdms2
     DataArray.to_dask_dataframe
     DataArray.to_dataframe
     DataArray.to_dataset
@@ -1116,7 +1115,6 @@ arguments for the ``load_store`` and ``dump_to_store`` Dataset methods:

     backends.NetCDF4DataStore
     backends.H5NetCDFStore
-    backends.PseudoNetCDFDataStore
     backends.PydapDataStore
     backends.ScipyDataStore
     backends.ZarrStore
@@ -1132,7 +1130,6 @@ used filetypes in the xarray universe.

     backends.NetCDF4BackendEntrypoint
     backends.H5netcdfBackendEntrypoint
-    backends.PseudoNetCDFBackendEntrypoint
     backends.PydapBackendEntrypoint
     backends.ScipyBackendEntrypoint
     backends.StoreBackendEntrypoint
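The newly documented ``DataArray.dt.total_seconds`` converts timedelta data to float seconds, mirroring the pandas accessor of the same name. A minimal sketch of the expected behavior (array values are illustrative):

    import numpy as np
    import xarray as xr

    # Two hypothetical timedeltas of one and two minutes.
    da = xr.DataArray(np.array([60, 120], dtype="timedelta64[s]"), dims="x")

    # dt.total_seconds converts each timedelta to a float number of seconds.
    print(da.dt.total_seconds().values)  # expected: [ 60. 120.]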
1 change: 1 addition & 0 deletions doc/conf.py

@@ -327,6 +327,7 @@
     "sparse": ("https://sparse.pydata.org/en/latest/", None),
     "cubed": ("https://tom-e-white.com/cubed/", None),
     "datatree": ("https://xarray-datatree.readthedocs.io/en/latest/", None),
+    "xarray-tutorial": ("https://tutorial.xarray.dev/", None),
     # "opt_einsum": ("https://dgasmith.github.io/opt_einsum/", None),
 }
13 changes: 3 additions & 10 deletions doc/getting-started-guide/faq.rst

@@ -168,18 +168,11 @@ integration with Cartopy_.
 .. _Iris: https://scitools-iris.readthedocs.io/en/stable/
 .. _Cartopy: https://scitools.org.uk/cartopy/docs/latest/

-`UV-CDAT`__ is another Python library that implements in-memory netCDF-like
-variables and `tools for working with climate data`__.
-
-__ https://uvcdat.llnl.gov/
-__ https://drclimate.wordpress.com/2014/01/02/a-beginners-guide-to-scripting-with-uv-cdat/
-
 We think the design decisions we have made for xarray (namely, basing it on
 pandas) make it a faster and more flexible data analysis tool. That said, Iris
-and CDAT have some great domain specific functionality, and xarray includes
-methods for converting back and forth between xarray and these libraries. See
-:py:meth:`~xarray.DataArray.to_iris` and :py:meth:`~xarray.DataArray.to_cdms2`
-for more details.
+has some great domain specific functionality, and xarray includes
+methods for converting back and forth between xarray and Iris. See
+:py:meth:`~xarray.DataArray.to_iris` for more details.

 What other projects leverage xarray?
 ------------------------------------
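Since the FAQ now points only at Iris, a minimal sketch of the round trip that :py:meth:`~xarray.DataArray.to_iris` provides (requires the optional iris dependency; names and values are illustrative):

    import xarray as xr

    # A small made-up DataArray.
    da = xr.DataArray(
        [[15.0, 16.5], [14.2, 15.8]],
        dims=("lat", "lon"),
        name="air_temperature",
    )

    cube = da.to_iris()                       # convert to an iris.cube.Cube
    roundtrip = xr.DataArray.from_iris(cube)  # and back to an xarray.DataArray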
3 changes: 0 additions & 3 deletions doc/getting-started-guide/installing.rst

@@ -38,9 +38,6 @@ For netCDF and IO
 - `cftime <https://unidata.github.io/cftime>`__: recommended if you
   want to encode/decode datetimes for non-standard calendars or dates before
   year 1678 or after year 2262.
-- `PseudoNetCDF <http://github.com/barronh/pseudonetcdf/>`__: recommended
-  for accessing CAMx, GEOS-Chem (bpch), NOAA ARL files, ICARTT files
-  (ffi1001) and many other.
 - `iris <https://github.com/scitools/iris>`__: for conversion to and from iris'
   Cube objects
2 changes: 1 addition & 1 deletion doc/howdoi.rst

@@ -36,7 +36,7 @@ How do I ...
    * - rename a variable, dimension or coordinate
      - :py:meth:`Dataset.rename`, :py:meth:`DataArray.rename`, :py:meth:`Dataset.rename_vars`, :py:meth:`Dataset.rename_dims`,
    * - convert a DataArray to Dataset or vice versa
-     - :py:meth:`DataArray.to_dataset`, :py:meth:`Dataset.to_array`, :py:meth:`Dataset.to_stacked_array`, :py:meth:`DataArray.to_unstacked_dataset`
+     - :py:meth:`DataArray.to_dataset`, :py:meth:`Dataset.to_dataarray`, :py:meth:`Dataset.to_stacked_array`, :py:meth:`DataArray.to_unstacked_dataset`
    * - extract variables that have certain attributes
      - :py:meth:`Dataset.filter_by_attrs`
    * - extract the underlying array (e.g. NumPy or Dask arrays)
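Several files above swap :py:meth:`Dataset.to_array` for its new name :py:meth:`Dataset.to_dataarray`. A minimal sketch of the renamed method (variable names are illustrative):

    import xarray as xr

    # A hypothetical two-variable dataset.
    ds = xr.Dataset({"a": ("x", [1, 2]), "b": ("x", [3, 4])})

    # to_dataarray stacks the data variables along a new dimension;
    # it behaves like the method previously exposed as Dataset.to_array.
    da = ds.to_dataarray(dim="variable")
    print(da.dims)  # expected: ('variable', 'x')

    # Round-trip back to a Dataset.
    ds2 = da.to_dataset(dim="variable")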
(Diffs for the remaining of the 49 changed files were not loaded in this capture.)
