bump minimum versions, drop py38 #7461

Merged: 17 commits on Jan 26, 2023
10 changes: 4 additions & 6 deletions .github/workflows/ci-additional.yaml
@@ -134,18 +134,16 @@ jobs:
name: codecov-umbrella
fail_ci_if_error: false

mypy38:
name: Mypy 3.8
mypy39:
name: Mypy 3.9
runs-on: "ubuntu-latest"
needs: detect-ci-trigger
# temporarily skipping due to https://github.com/pydata/xarray/issues/6551
if: needs.detect-ci-trigger.outputs.triggered == 'false'
defaults:
run:
shell: bash -l {0}
env:
CONDA_ENV_FILE: ci/requirements/environment.yml
PYTHON_VERSION: "3.8"
PYTHON_VERSION: "3.9"

steps:
- uses: actions/checkout@v3
@@ -185,7 +183,7 @@ jobs:
uses: codecov/[email protected]
with:
file: mypy_report/cobertura.xml
flags: mypy38
flags: mypy39
env_vars: PYTHON_VERSION
name: codecov-umbrella
fail_ci_if_error: false
6 changes: 3 additions & 3 deletions .github/workflows/ci.yaml
@@ -42,15 +42,15 @@ jobs:
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
# Bookend python versions
python-version: ["3.8", "3.10", "3.11"]
python-version: ["3.9", "3.10", "3.11"]
env: [""]
include:
# Minimum python version:
- env: "bare-minimum"
python-version: "3.8"
python-version: "3.9"
os: ubuntu-latest
- env: "min-all-deps"
python-version: "3.8"
python-version: "3.9"
os: ubuntu-latest
# Latest python version:
- env: "all-but-dask"
4 changes: 2 additions & 2 deletions .github/workflows/pypi-release.yaml
@@ -18,7 +18,7 @@ jobs:
- uses: actions/setup-python@v4
name: Install Python
with:
python-version: 3.8
python-version: "3.11"

- name: Install dependencies
run: |
@@ -53,7 +53,7 @@ jobs:
- uses: actions/setup-python@v4
name: Install Python
with:
python-version: 3.8
python-version: "3.11"
- uses: actions/download-artifact@v3
with:
name: releases
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -29,7 +29,7 @@ repos:
hooks:
- id: pyupgrade
args:
- "--py38-plus"
- "--py39-plus"
# https://github.com/python/black#version-control-integration
- repo: https://github.com/psf/black
rev: 22.12.0
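The `--py39-plus` flag tells pyupgrade it may emit Python 3.9+ idioms, most visibly PEP 585 builtin generics in annotations. A minimal sketch of the kind of rewrite involved; the function and its name are illustrative, not code from this PR:

```python
# Before pyupgrade --py39-plus (typing aliases required on Python 3.8):
#     from typing import Dict, List
#     def count_items(groups: Dict[str, List[int]]) -> int: ...
#
# After: builtin generics (PEP 585), valid at runtime on Python 3.9+,
# with the now-unneeded typing imports removed.
def count_items(groups: dict[str, list[int]]) -> int:
    # Sum the lengths of all value lists.
    return sum(len(v) for v in groups.values())

print(count_items({"a": [1, 2], "b": [3]}))  # prints 3
```

The rewrite is purely syntactic: the annotated function behaves identically before and after.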
2 changes: 1 addition & 1 deletion asv_bench/asv.conf.json
@@ -40,7 +40,7 @@

// The Pythons you'd like to test against. If not provided, defaults
// to the current version of Python used to run `asv`.
"pythons": ["3.8"],
"pythons": ["3.10"],

// The matrix of dependencies to test. Each key is the name of a
// package (in PyPI) and the values are version numbers. An empty
12 changes: 7 additions & 5 deletions ci/min_deps_check.py
@@ -1,11 +1,13 @@
#!/usr/bin/env python
"""Fetch from conda database all available versions of the xarray dependencies and their
publication date. Compare it against requirements/py37-min-all-deps.yml to verify the
policy on obsolete dependencies is being followed. Print a pretty report :)
"""
import itertools
import sys
from collections.abc import Iterator
from datetime import datetime
from typing import Dict, Iterator, Optional, Tuple
from typing import Optional

import conda.api # type: ignore[import]
import yaml
@@ -29,7 +31,7 @@

POLICY_MONTHS = {"python": 24, "numpy": 18}
POLICY_MONTHS_DEFAULT = 12
POLICY_OVERRIDE: Dict[str, Tuple[int, int]] = {}
POLICY_OVERRIDE: dict[str, tuple[int, int]] = {}
errors = []


@@ -43,7 +45,7 @@ def warning(msg: str) -> None:
print("WARNING:", msg)


def parse_requirements(fname) -> Iterator[Tuple[str, int, int, Optional[int]]]:
def parse_requirements(fname) -> Iterator[tuple[str, int, int, Optional[int]]]:
"""Load requirements/py37-min-all-deps.yml

Yield (package name, major version, minor version, [patch version])
@@ -75,7 +77,7 @@ def parse_requirements(fname) -> Iterator[Tuple[str, int, int, Optional[int]]]:
raise ValueError("expected major.minor or major.minor.patch: " + row)


def query_conda(pkg: str) -> Dict[Tuple[int, int], datetime]:
def query_conda(pkg: str) -> dict[tuple[int, int], datetime]:
"""Query the conda repository for a specific package

Return map of {(major version, minor version): publication date}
@@ -115,7 +117,7 @@ def metadata(entry):

def process_pkg(
pkg: str, req_major: int, req_minor: int, req_patch: Optional[int]
) -> Tuple[str, str, str, str, str, str]:
) -> tuple[str, str, str, str, str, str]:
"""Compare package version from requirements file to available versions in conda.
Return row to build pandas dataframe:

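The ci/min_deps_check.py hunks replace `typing.Dict`/`typing.Tuple` with the builtin generics that Python 3.9 accepts directly (PEP 585). A runnable sketch of the pattern; `parse_version` is a simplified stand-in for the script's requirement parsing, not its actual code:

```python
from typing import Optional

# Builtin generics are subscriptable at runtime on Python 3.9+,
# so no typing.Dict / typing.Tuple imports are needed.
POLICY_OVERRIDE: dict[str, tuple[int, int]] = {}

def parse_version(row: str) -> tuple[int, int, Optional[int]]:
    """Split 'major.minor' or 'major.minor.patch' into a version tuple."""
    parts = row.split(".")
    if len(parts) == 2:
        return int(parts[0]), int(parts[1]), None
    if len(parts) == 3:
        return int(parts[0]), int(parts[1]), int(parts[2])
    raise ValueError("expected major.minor or major.minor.patch: " + row)

print(parse_version("1.21"))   # (1, 21, None)
print(parse_version("1.5.7"))  # (1, 5, 7)
```

Note that `Optional` still comes from `typing` here; the PEP 604 `X | None` spelling only became available in Python 3.10.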
6 changes: 3 additions & 3 deletions ci/requirements/bare-minimum.yml
@@ -3,13 +3,13 @@ channels:
- conda-forge
- nodefaults
dependencies:
- python=3.8
- python=3.9
- coveralls
- pip
- pytest
- pytest-cov
- pytest-env
- pytest-xdist
- numpy=1.20
- numpy=1.21
- packaging=21.3
- pandas=1.3
- pandas=1.4
12 changes: 6 additions & 6 deletions ci/requirements/doc.yml
@@ -8,8 +8,8 @@ dependencies:
- bottleneck
- cartopy
- cfgrib>=0.9
- dask-core>=2.30
- h5netcdf>=0.7.4
- dask-core>=2022.1
- h5netcdf>=0.13
- ipykernel
- ipython
- iris>=2.3
@@ -18,9 +18,9 @@ dependencies:
- nbsphinx
- netcdf4>=1.5
- numba
- numpy>=1.20,<1.24
- packaging>=21.0
- pandas>=1.3
- numpy>=1.21,<1.24
- packaging>=21.3
- pandas>=1.4
- pooch
- pip
- pydata-sphinx-theme>=0.4.3
@@ -35,7 +35,7 @@ dependencies:
- sphinx-copybutton
- sphinx-design
- sphinx!=4.4.0
- zarr>=2.4
- zarr>=2.10
- pip:
- sphinxext-rediraffe
- sphinxext-opengraph
16 changes: 8 additions & 8 deletions ci/requirements/min-all-deps.yml
@@ -7,35 +7,35 @@ dependencies:
# Run ci/min_deps_check.py to verify that this file respects the policy.
# When upgrading python, numpy, or pandas, must also change
# doc/user-guide/installing.rst, doc/user-guide/plotting.rst and setup.py.
- python=3.8
- python=3.9
- boto3=1.20
- bottleneck=1.3
- cartopy=0.20
- cdms2=3.1
- cfgrib=0.9
- cftime=1.5
- coveralls
- dask-core=2021.11
- distributed=2021.11
- dask-core=2022.1
- distributed=2022.1
- flox=0.5
- h5netcdf=0.11
- h5netcdf=0.13
# h5py and hdf5 tend to cause conflicts
# for e.g. hdf5 1.12 conflicts with h5py=3.1
# prioritize bumping other packages instead
- h5py=3.6
- hdf5=1.12
- hypothesis
- iris=3.1
- lxml=4.6 # Optional dep of pydap
- lxml=4.7 # Optional dep of pydap
- matplotlib-base=3.5
- nc-time-axis=1.4
# netcdf follows a 1.major.minor[.patch] convention
# (see https://github.com/Unidata/netcdf4-python/issues/1090)
- netcdf4=1.5.7
- numba=0.54
- numpy=1.20
- numba=0.55
- numpy=1.21
- packaging=21.3
- pandas=1.3
- pandas=1.4
- pint=0.18
- pip
- pseudonetcdf=3.2
17 changes: 9 additions & 8 deletions doc/contributing.rst
@@ -154,7 +154,7 @@ We'll now kick off a two-step process:
.. code-block:: sh

# Create and activate the build environment
conda create -c conda-forge -n xarray-tests python=3.8
conda create -c conda-forge -n xarray-tests python=3.10

# This is for Linux and MacOS
conda env update -f ci/requirements/environment.yml
@@ -571,9 +571,9 @@ A test run of this yields

((xarray) $ pytest test_cool_feature.py -v
=============================== test session starts ================================
platform darwin -- Python 3.6.4, pytest-3.2.1, py-1.4.34, pluggy-0.4.0 --
cachedir: ../../.cache
plugins: cov-2.5.1, hypothesis-3.23.0
platform darwin -- Python 3.10.6, pytest-7.2.0, pluggy-1.0.0 --
cachedir: .pytest_cache
plugins: hypothesis-6.56.3, cov-4.0.0
collected 11 items

test_cool_feature.py::test_dtypes[int8] PASSED
@@ -599,7 +599,9 @@ which match ``int8``.

((xarray) bash-3.2$ pytest test_cool_feature.py -v -k int8
=========================== test session starts ===========================
platform darwin -- Python 3.6.2, pytest-3.2.1, py-1.4.31, pluggy-0.4.0
platform darwin -- Python 3.10.6, pytest-7.2.0, pluggy-1.0.0 --
cachedir: .pytest_cache
plugins: hypothesis-6.56.3, cov-4.0.0
collected 11 items

test_cool_feature.py::test_dtypes[int8] PASSED
@@ -645,8 +647,7 @@ Performance matters and it is worth considering whether your code has introduced
performance regressions. *xarray* is starting to write a suite of benchmarking tests
using `asv <https://github.com/spacetelescope/asv>`__
to enable easy monitoring of the performance of critical *xarray* operations.
These benchmarks are all found in the ``xarray/asv_bench`` directory. asv
supports both python2 and python3.
These benchmarks are all found in the ``xarray/asv_bench`` directory.

To use all features of asv, you will need either ``conda`` or
``virtualenv``. For more details please check the `asv installation
@@ -699,7 +700,7 @@ environment by::

or, to use a specific Python interpreter,::

asv run -e -E existing:python3.6
asv run -e -E existing:python3.10

This will display stderr from the benchmarks, and use your local
``python`` that comes from your ``$PATH``.
6 changes: 3 additions & 3 deletions doc/getting-started-guide/installing.rst
@@ -6,10 +6,10 @@ Installation
Required dependencies
---------------------

- Python (3.8 or later)
- `numpy <https://www.numpy.org/>`__ (1.20 or later)
- Python (3.9 or later)
- `numpy <https://www.numpy.org/>`__ (1.21 or later)
- `packaging <https://packaging.pypa.io/en/latest/#>`__ (21.3 or later)
- `pandas <https://pandas.pydata.org/>`__ (1.3 or later)
- `pandas <https://pandas.pydata.org/>`__ (1.4 or later)

.. _optional-dependencies:

15 changes: 15 additions & 0 deletions doc/whats-new.rst
@@ -27,6 +27,21 @@ New Features
Breaking changes
~~~~~~~~~~~~~~~~

- Support for ``python 3.8`` has been dropped and the minimum versions of some
dependencies were changed (:pull:`7461`):

  ===================== ========= ========
  Package               Old       New
  ===================== ========= ========
  python                3.8       3.9
  numpy                 1.20      1.21
  pandas                1.3       1.4
  dask                  2021.11   2022.1
  distributed           2021.11   2022.1
  h5netcdf              0.11      0.13
  lxml                  4.6       4.7
  numba                 0.54      0.55
  ===================== ========= ========
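The practical effect of the new floor can be sketched as a version gate; `MIN_PYTHON` below is illustrative, the authoritative constraint is `python_requires = >=3.9` in setup.cfg:

```python
import sys

MIN_PYTHON = (3, 9)  # mirrors python_requires = >=3.9 in setup.cfg

def supported(version_info=None) -> bool:
    """Return True if the interpreter meets the new minimum version."""
    vi = sys.version_info if version_info is None else version_info
    return tuple(vi[:2]) >= MIN_PYTHON

print(supported((3, 8, 16)))  # False: Python 3.8 is no longer supported
print(supported((3, 9, 0)))   # True
```

In practice pip enforces this via package metadata, so installs on Python 3.8 resolve to the last release that still supported it rather than failing outright.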

Deprecations
~~~~~~~~~~~~
4 changes: 2 additions & 2 deletions requirements.txt
@@ -2,6 +2,6 @@
# it exists to let GitHub build the repository dependency graph
# https://help.github.com/en/github/visualizing-repository-data-with-graphs/listing-the-packages-that-a-repository-depends-on

numpy >= 1.20
numpy >= 1.21
packaging >= 21.3
pandas >= 1.3
pandas >= 1.4
7 changes: 3 additions & 4 deletions setup.cfg
@@ -64,7 +64,7 @@ classifiers =
Intended Audience :: Science/Research
Programming Language :: Python
Programming Language :: Python :: 3
Programming Language :: Python :: 3.8
Programming Language :: Python :: 3.9
Programming Language :: Python :: 3.10
Programming Language :: Python :: 3.11
@@ -74,10 +73,10 @@ classifiers =
packages = find:
zip_safe = False # https://mypy.readthedocs.io/en/latest/installed_packages.html
include_package_data = True
python_requires = >=3.8
python_requires = >=3.9
install_requires =
numpy >= 1.20 # recommended to use >= 1.22 for full quantile method support
pandas >= 1.3
numpy >= 1.21 # recommended to use >= 1.22 for full quantile method support
pandas >= 1.4
packaging >= 21.3

[options.extras_require]
23 changes: 4 additions & 19 deletions xarray/backends/api.py
@@ -1,27 +1,12 @@
from __future__ import annotations

import os
from collections.abc import Hashable, Iterable, Mapping, MutableMapping, Sequence
from functools import partial
from glob import glob
from io import BytesIO
from numbers import Number
from typing import (
TYPE_CHECKING,
Any,
Callable,
Dict,
Final,
Hashable,
Iterable,
Literal,
Mapping,
MutableMapping,
Sequence,
Type,
Union,
cast,
overload,
)
from typing import TYPE_CHECKING, Any, Callable, Final, Literal, Union, cast, overload

import numpy as np

@@ -59,11 +44,11 @@
T_Engine = Union[
T_NetcdfEngine,
Literal["pydap", "pynio", "pseudonetcdf", "cfgrib", "zarr"],
Type[BackendEntrypoint],
type[BackendEntrypoint],
str, # no nice typing support for custom backends
None,
]
T_Chunks = Union[int, Dict[Any, Any], Literal["auto"], None]
T_Chunks = Union[int, dict[Any, Any], Literal["auto"], None]
T_NetcdfTypes = Literal[
"NETCDF4", "NETCDF4_CLASSIC", "NETCDF3_64BIT", "NETCDF3_CLASSIC"
]
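With `from __future__ import annotations` plus PEP 585, aliases like `T_Chunks` can use the lowercase builtin generics inside `typing.Union` even on Python 3.9. The helper below shows how such a union alias might be consumed; it and its behavior are illustrative, not xarray's actual API:

```python
from __future__ import annotations

from typing import Any, Literal, Union

# Lowercase dict[...] works inside typing.Union at runtime on Python 3.9+.
T_Chunks = Union[int, dict[Any, Any], Literal["auto"], None]

def describe_chunks(chunks: T_Chunks) -> str:
    """Classify a chunk spec the way a dispatcher might (toy example)."""
    if chunks is None:
        return "unchunked"
    if chunks == "auto":
        return "auto-chunked"
    if isinstance(chunks, int):
        return f"uniform chunks of {chunks}"
    return f"per-dimension chunks for {sorted(chunks)}"

print(describe_chunks(None))       # unchunked
print(describe_chunks(4))          # uniform chunks of 4
print(describe_chunks({"x": 10}))  # per-dimension chunks for ['x']
```

Because `T_Chunks` is evaluated eagerly when the module loads (it is a runtime assignment, not an annotation), the PR's switch from `Dict[...]` to `dict[...]` is only safe once 3.9 is the minimum.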