Fix mypy errors in xarray.py, xrutils.py, cache.py #144

Merged — 49 commits, Sep 23, 2022
c972e97
update dim typing
Illviljan Sep 17, 2022
2e42456
Merge branch 'main' into dim_typing
Illviljan Sep 19, 2022
64c7d77
Fix mypy errors in xarray.py
Illviljan Sep 19, 2022
b3d698a
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 19, 2022
6e4db03
start mypy ci
Illviljan Sep 19, 2022
afee7c4
Merge branch 'dim_typing' of https://github.com/Illviljan/flox into d…
Illviljan Sep 19, 2022
ed752dd
Use T_DataArray and T_Dataset
Illviljan Sep 19, 2022
6303f4a
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 19, 2022
ae8953a
Add mypy ignores
Illviljan Sep 19, 2022
8fba166
Merge branch 'dim_typing' of https://github.com/Illviljan/flox into d…
Illviljan Sep 19, 2022
ae5561d
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 19, 2022
5145dc2
correct typing a bit
Illviljan Sep 19, 2022
5d46140
Merge branch 'dim_typing' of https://github.com/Illviljan/flox into d…
Illviljan Sep 19, 2022
05893a2
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 19, 2022
375c31b
test newer flake8 if ellipsis passes there
Illviljan Sep 19, 2022
6ba6da4
Merge branch 'dim_typing' of https://github.com/Illviljan/flox into d…
Illviljan Sep 19, 2022
170467b
Allow ellipsis in flake8
Illviljan Sep 19, 2022
a3d63a2
Update core.py
Illviljan Sep 19, 2022
cf0d6cd
Update xarray.py
Illviljan Sep 20, 2022
bde6c52
Merge branch 'main' into dim_typing
Illviljan Sep 20, 2022
3728858
Update setup.cfg
Illviljan Sep 20, 2022
657496d
Update xarray.py
Illviljan Sep 20, 2022
68ac242
Update xarray.py
Illviljan Sep 20, 2022
c306099
Update xarray.py
Illviljan Sep 20, 2022
90b0149
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 20, 2022
332caf9
Update xarray.py
Illviljan Sep 20, 2022
9740009
Update pyproject.toml
Illviljan Sep 20, 2022
5c08114
Update xarray.py
Illviljan Sep 20, 2022
21b641d
Merge branch 'main' into dim_typing
Illviljan Sep 20, 2022
d5409ef
Update xarray.py
Illviljan Sep 20, 2022
1accd73
hopefully no more pytest errors.
Illviljan Sep 20, 2022
a50bb6b
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 20, 2022
50c2ac2
make sure expected_groups doesn't have None
Illviljan Sep 20, 2022
db2ac1b
Merge branch 'dim_typing' of https://github.com/Illviljan/flox into d…
Illviljan Sep 20, 2022
1921938
Update flox/xarray.py
Illviljan Sep 20, 2022
3cac4b0
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 20, 2022
43dabff
ds_broad and longer comment
Illviljan Sep 20, 2022
e73f6e8
Use same for loop for similar things.
Illviljan Sep 21, 2022
2d62748
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 21, 2022
62cc554
Merge pull request #31 from xarray-contrib/main
Illviljan Sep 21, 2022
41e97e9
fix xrutils.py
Illviljan Sep 21, 2022
fc36211
fix errors in cache.py
Illviljan Sep 21, 2022
a5d41a5
Merge branch 'main' into dim_typing
Illviljan Sep 21, 2022
bfb9c6e
Turn off mypy check
Illviljan Sep 21, 2022
7260660
Update flox/xarray.py
Illviljan Sep 22, 2022
b34c268
Update flox/xarray.py
Illviljan Sep 22, 2022
eaf93d2
Use if else format to avoid tuple creation
Illviljan Sep 22, 2022
9486184
Update xarray.py
Illviljan Sep 22, 2022
b18d209
Merge branch 'main' into dim_typing
Illviljan Sep 22, 2022
80 changes: 40 additions & 40 deletions .github/workflows/ci-additional.yaml
@@ -73,46 +73,46 @@ jobs:
run: |
python -m pytest --doctest-modules flox --ignore flox/tests

# mypy:
# name: Mypy
# runs-on: "ubuntu-latest"
# needs: detect-ci-trigger
# if: needs.detect-ci-trigger.outputs.triggered == 'false'
# defaults:
# run:
# shell: bash -l {0}
# env:
# CONDA_ENV_FILE: ci/environment.yml
# PYTHON_VERSION: "3.10"
mypy:
name: Mypy
runs-on: "ubuntu-latest"
needs: detect-ci-trigger
if: needs.detect-ci-trigger.outputs.triggered == 'false'
defaults:
run:
shell: bash -l {0}
env:
CONDA_ENV_FILE: ci/environment.yml
PYTHON_VERSION: "3.10"

# steps:
# - uses: actions/checkout@v3
# with:
# fetch-depth: 0 # Fetch all history for all branches and tags.
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0 # Fetch all history for all branches and tags.

# - name: set environment variables
# run: |
# echo "TODAY=$(date +'%Y-%m-%d')" >> $GITHUB_ENV
# - name: Setup micromamba
# uses: mamba-org/provision-with-micromamba@34071ca7df4983ccd272ed0d3625818b27b70dcc
# with:
# environment-file: ${{env.CONDA_ENV_FILE}}
# environment-name: xarray-tests
# extra-specs: |
# python=${{env.PYTHON_VERSION}}
# cache-env: true
# cache-env-key: "${{runner.os}}-${{runner.arch}}-py${{env.PYTHON_VERSION}}-${{env.TODAY}}-${{hashFiles(env.CONDA_ENV_FILE)}}"
# - name: Install xarray
# run: |
# python -m pip install --no-deps -e .
# - name: Version info
# run: |
# conda info -a
# conda list
# - name: Install mypy
# run: |
# python -m pip install mypy
- name: set environment variables
run: |
echo "TODAY=$(date +'%Y-%m-%d')" >> $GITHUB_ENV
- name: Setup micromamba
uses: mamba-org/provision-with-micromamba@34071ca7df4983ccd272ed0d3625818b27b70dcc
with:
environment-file: ${{env.CONDA_ENV_FILE}}
environment-name: xarray-tests
extra-specs: |
python=${{env.PYTHON_VERSION}}
cache-env: true
cache-env-key: "${{runner.os}}-${{runner.arch}}-py${{env.PYTHON_VERSION}}-${{env.TODAY}}-${{hashFiles(env.CONDA_ENV_FILE)}}"
- name: Install xarray
run: |
python -m pip install --no-deps -e .
- name: Version info
run: |
conda info -a
conda list
- name: Install mypy
run: |
python -m pip install mypy

# - name: Run mypy
# run: |
# python -m mypy --install-types --non-interactive
- name: Run mypy
run: |
python -m mypy --install-types --non-interactive
4 changes: 2 additions & 2 deletions flox/core.py
@@ -1282,11 +1282,11 @@ def _assert_by_is_aligned(shape, by):


def _convert_expected_groups_to_index(
expected_groups: tuple, isbin: bool, sort: bool
expected_groups: tuple, isbin: Sequence[bool], sort: bool
) -> pd.Index | None:
out = []
for ex, isbin_ in zip(expected_groups, isbin):
if isinstance(ex, pd.IntervalIndex) or (isinstance(ex, pd.Index) and not isbin):
if isinstance(ex, pd.IntervalIndex) or (isinstance(ex, pd.Index) and not isbin_):
if sort:
ex = ex.sort_values()
out.append(ex)
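The one-character change above fixes a real truthiness bug: once `isbin` became a per-group sequence, the old check `not isbin` tested the whole sequence (which is always truthy when non-empty) instead of the current group's flag `isbin_`. A minimal standalone sketch of the difference (names are illustrative stand-ins, not flox's actual objects):

```python
# Per-group bin flags, as passed to _convert_expected_groups_to_index.
isbin = (False, True)
expected_groups = ("labels_a", "labels_b")  # stand-ins for pd.Index objects

old_checks = []
new_checks = []
for ex, isbin_ in zip(expected_groups, isbin):
    old_checks.append(not isbin)   # buggy: a non-empty tuple is truthy, so always False
    new_checks.append(not isbin_)  # fixed: respects the per-group flag

# old_checks == [False, False] — the "plain index" branch was never taken;
# new_checks == [True, False] — the first (non-binned) group is handled.
```

Under the buggy check, a plain `pd.Index` passed alongside a sequence-valued `isbin` would never be sorted, because the `and not isbin` clause was unconditionally False.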
106 changes: 55 additions & 51 deletions flox/xarray.py
@@ -1,6 +1,6 @@
from __future__ import annotations

from typing import TYPE_CHECKING, Hashable, Iterable, Sequence
from typing import TYPE_CHECKING, Hashable, Iterable, Sequence, Union

import numpy as np
import pandas as pd
@@ -19,7 +19,10 @@
from .xrutils import _contains_cftime_datetimes, _to_pytimedelta, datetime_to_numeric

if TYPE_CHECKING:
from xarray import DataArray, Dataset, Resample
from xarray import DataArray, Dataset # TODO: Use T_DataArray, T_Dataset?
Collaborator: Sure, we should explicitly say that xarray.types (is that right?) is public somewhere in the xarray docs.

Contributor (author): It's xarray.core.types, so I suppose it's technically private at the moment. Maybe for the better? I don't think .types has settled enough yet to start recommending it to the larger audience. Doesn't stop us from using it early, though! :)

I mainly wrote the TODO because I had issues with mypy, but this was the solution:

# This errors if obj: T_Dataset | T_DataArray.
if isinstance(obj, xr.DataArray):
    ds = obj._to_temp_dataset()
else:
    ds = obj

# This passes if obj: T_Dataset | T_DataArray.
if isinstance(obj, xr.Dataset):
    ds = obj
else:
    ds = obj._to_temp_dataset()

Collaborator: Great, this would be a good issue to open over at xarray.

Collaborator: The other reason this is fine is that I'd like to move the contents of this file over to xarray in the long term.

from xarray.core.resample import Resample

Dims = Union[str, Iterable[Hashable], None]


def _get_input_core_dims(group_names, dim, ds, grouper_dims):
@@ -52,12 +55,12 @@ def lookup_order(dimension):

def xarray_reduce(
obj: Dataset | DataArray,
*by: DataArray | Iterable[str] | Iterable[DataArray],
*by: DataArray | Hashable,
func: str | Aggregation,
expected_groups=None,
isbin: bool | Sequence[bool] = False,
sort: bool = True,
dim: Hashable = None,
dim: Dims | ellipsis = None,
split_out: int = 1,
fill_value=None,
method: str = "map-reduce",
@@ -206,8 +209,10 @@ def xarray_reduce(
if keep_attrs is None:
keep_attrs = True

if isinstance(isbin, bool):
isbin = (isbin,) * len(by)
if isinstance(isbin, Sequence):
isbins = isbin
else:
isbins = (isbin,) * len(by)
if expected_groups is None:
expected_groups = (None,) * len(by)
if isinstance(expected_groups, (np.ndarray, list)): # TODO: test for list
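The hunk above normalizes the `bool | Sequence[bool]` parameter into a per-group tuple under a new name (`isbins`), which keeps mypy's inference clean in each branch. A self-contained sketch of the same normalization (the function name is illustrative, not flox's API):

```python
from collections.abc import Sequence
from typing import Union


def normalize_isbin(isbin: Union[bool, Sequence[bool]], ngroups: int) -> Sequence[bool]:
    # Testing for Sequence first mirrors the PR: in this branch mypy sees
    # Sequence[bool]; in the fallback branch, a plain bool to broadcast.
    if isinstance(isbin, Sequence):
        return isbin
    return (isbin,) * ngroups
```

Binding the result to a fresh name also avoids mypy complaints about a variable's type changing across an assignment, which is why the PR introduces `isbins` rather than reassigning `isbin`.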
@@ -220,17 +225,17 @@
raise NotImplementedError

# eventually drop the variables we are grouping by
maybe_drop = [b for b in by if isinstance(b, str)]
maybe_drop = [b for b in by if isinstance(b, Hashable)]
unindexed_dims = tuple(
b
for b, isbin_ in zip(by, isbin)
if isinstance(b, str) and not isbin_ and b in obj.dims and b not in obj.indexes
for b, isbin_ in zip(by, isbins)
if isinstance(b, Hashable) and not isbin_ and b in obj.dims and b not in obj.indexes
)

by: tuple[DataArray] = tuple(obj[g] if isinstance(g, str) else g for g in by) # type: ignore
by_da = tuple(obj[g] if isinstance(g, Hashable) else g for g in by)

grouper_dims = []
for g in by:
for g in by_da:
for d in g.dims:
if d not in grouper_dims:
grouper_dims.append(d)
Expand All @@ -243,53 +248,52 @@ def xarray_reduce(
ds = ds.drop_vars([var for var in maybe_drop if var in ds.variables])

if dim is Ellipsis:
dim = tuple(obj.dims)
if by[0].name in ds.dims and not isbin[0]:
dim = tuple(d for d in dim if d != by[0].name)
dim_tuple = tuple(obj.dims)
if by_da[0].name in ds.dims and not isbins[0]:
dim_tuple = tuple(d for d in dim_tuple if d != by_da[0].name)
elif dim is not None:
dim = _atleast_1d(dim)
dim_tuple = _atleast_1d(dim)
else:
dim = tuple()
dim_tuple = tuple()

# broadcast all variables against each other along all dimensions in `by` variables
# don't exclude `dim` because it need not be a dimension in any of the `by` variables!
# in the case where dim is Ellipsis, and by.ndim < obj.ndim
# then we also broadcast `by` to all `obj.dims`
# TODO: avoid this broadcasting
exclude_dims = tuple(d for d in ds.dims if d not in grouper_dims and d not in dim)
ds, *by = xr.broadcast(ds, *by, exclude=exclude_dims)
exclude_dims = tuple(d for d in ds.dims if d not in grouper_dims and d not in dim_tuple)
ds_broad, *by_broad = xr.broadcast(ds, *by_da, exclude=exclude_dims)

if not dim:
dim = tuple(by[0].dims)
if not dim_tuple:
dim_tuple = tuple(by_broad[0].dims)

if any(d not in grouper_dims and d not in obj.dims for d in dim):
raise ValueError(f"Cannot reduce over absent dimensions {dim}.")
if any(d not in grouper_dims and d not in obj.dims for d in dim_tuple):
raise ValueError(f"Cannot reduce over absent dimensions {dim_tuple}.")

dims_not_in_groupers = tuple(d for d in dim if d not in grouper_dims)
if dims_not_in_groupers == tuple(dim) and not any(isbin):
dims_not_in_groupers = tuple(d for d in dim_tuple if d not in grouper_dims)
if dims_not_in_groupers == dim_tuple and not any(isbins):
# reducing along a dimension along which groups do not vary
# This is really just a normal reduction.
# This is not right when binning so we exclude.
if skipna and isinstance(func, str):
dsfunc = func[3:]
else:
dsfunc = func
if isinstance(func, str):
dsfunc = func[3:] if skipna else func
# TODO: skipna needs test
result = getattr(ds, dsfunc)(dim=dim, skipna=skipna)
result = getattr(ds_broad, dsfunc)(dim=dim_tuple, skipna=skipna)
if isinstance(obj, xr.DataArray):
return obj._from_temp_dataset(result)
else:
return result

axis = tuple(range(-len(dim), 0))
group_names = tuple(g.name if not binned else f"{g.name}_bins" for g, binned in zip(by, isbin))
axis = tuple(range(-len(dim_tuple), 0))
group_names = tuple(
g.name if not binned else f"{g.name}_bins" for g, binned in zip(by_broad, isbins)
)

group_shape = [None] * len(by)
expected_groups = list(expected_groups)

# Set expected_groups and convert to index since we need coords, sizes
# for output xarray objects
for idx, (b, expect, isbin_) in enumerate(zip(by, expected_groups, isbin)):
for idx, (b_, expect, isbin_) in enumerate(zip(by_broad, expected_groups, isbins)):
if isbin_ and isinstance(expect, int):
raise NotImplementedError(
"flox does not support binning into an integer number of bins yet."
Expand All @@ -300,9 +304,9 @@ def xarray_reduce(
f"Please provided bin edges for group variable {idx} "
f"named {group_names[idx]} in expected_groups."
)
expected_groups[idx] = _get_expected_groups(b.data, sort=sort, raise_if_dask=True)
expected_groups[idx] = _get_expected_groups(b_.data, sort=sort, raise_if_dask=True)

expected_groups = _convert_expected_groups_to_index(expected_groups, isbin, sort=sort)
expected_groups = _convert_expected_groups_to_index(expected_groups, isbins, sort=sort)
group_shape = tuple(len(e) for e in expected_groups)
group_sizes = dict(zip(group_names, group_shape))

@@ -350,20 +354,20 @@ def wrapper(array, *by, func, skipna, **kwargs):
if isinstance(obj, xr.Dataset):
# broadcasting means the group dim gets added to ds, so we check the original obj
for k, v in obj.data_vars.items():
is_missing_dim = not (any(d in v.dims for d in dim))
is_missing_dim = not (any(d in v.dims for d in dim_tuple))
if is_missing_dim:
missing_dim[k] = v

input_core_dims = _get_input_core_dims(group_names, dim, ds, grouper_dims)
input_core_dims += [input_core_dims[-1]] * (len(by) - 1)
input_core_dims = _get_input_core_dims(group_names, dim_tuple, ds_broad, grouper_dims)
input_core_dims += [input_core_dims[-1]] * (len(by_broad) - 1)

actual = xr.apply_ufunc(
wrapper,
ds.drop_vars(tuple(missing_dim)).transpose(..., *grouper_dims),
*by,
ds_broad.drop_vars(tuple(missing_dim)).transpose(..., *grouper_dims),
*by_broad,
input_core_dims=input_core_dims,
# for xarray's test_groupby_duplicate_coordinate_labels
exclude_dims=set(dim),
exclude_dims=set(dim_tuple),
output_core_dims=[group_names],
dask="allowed",
dask_gufunc_kwargs=dict(output_sizes=group_sizes),
@@ -380,27 +384,27 @@ def wrapper(array, *by, func, skipna, **kwargs):
"engine": engine,
"reindex": reindex,
"expected_groups": tuple(expected_groups),
"isbin": isbin,
"isbin": isbins,
"finalize_kwargs": finalize_kwargs,
},
)

# restore non-dim coord variables without the core dimension
# TODO: shouldn't apply_ufunc handle this?
for var in set(ds.variables) - set(ds.dims):
if all(d not in ds[var].dims for d in dim):
actual[var] = ds[var]
for var in set(ds_broad.variables) - set(ds_broad.dims):
if all(d not in ds_broad[var].dims for d in dim_tuple):
actual[var] = ds_broad[var]

for name, expect, by_ in zip(group_names, expected_groups, by):
for name, expect, by_ in zip(group_names, expected_groups, by_broad):
# Can't remove this till xarray handles IntervalIndex
if isinstance(expect, pd.IntervalIndex):
expect = expect.to_numpy()
if isinstance(actual, xr.Dataset) and name in actual:
actual = actual.drop_vars(name)
# When grouping by MultiIndex, expect is an pd.Index wrapping
# an object array of tuples
if name in ds.indexes and isinstance(ds.indexes[name], pd.MultiIndex):
levelnames = ds.indexes[name].names
if name in ds_broad.indexes and isinstance(ds_broad.indexes[name], pd.MultiIndex):
levelnames = ds_broad.indexes[name].names
expect = pd.MultiIndex.from_tuples(expect.values, names=levelnames)
actual[name] = expect
if Version(xr.__version__) > Version("2022.03.0"):
Expand All @@ -413,19 +417,19 @@ def wrapper(array, *by, func, skipna, **kwargs):
if unindexed_dims:
actual = actual.drop_vars(unindexed_dims)

if len(by) == 1:
if len(by_broad) == 1:
for var in actual:
if isinstance(obj, xr.DataArray):
template = obj
else:
template = obj[var]
if actual[var].ndim > 1:
actual[var] = _restore_dim_order(actual[var], template, by[0])
actual[var] = _restore_dim_order(actual[var], template, by_broad[0])

if missing_dim:
for k, v in missing_dim.items():
missing_group_dims = {
dim: size for dim, size in group_sizes.items() if dim not in v.dims
dim_: size for dim_, size in group_sizes.items() if dim_ not in v.dims
}
# The expand_dims is for backward compat with xarray's questionable behaviour
if missing_group_dims: