Commit
* map_block attempt 2
* Address reviews: errors, args + kwargs support.
* Works with datasets!
* remove wrong comment.
* Support chunks.
* infer template.
* cleanup
* cleanup2
* api.rst
* simple shape change error check.
* Make test more complicated.
* Fix for when user function doesn't set DataArray.name
* Now _to_temp_dataset works.
* Add whats-new
* chunks kwarg makes no sense right now.
* review feedback: 1. skip index graph nodes. 2. var → name 3. quicker dataarray creation. 4. Add restrictions to docstring. 5. rename chunk construction task. 6. error when non-xarray object is returned. 7. restore non-coord dims. review
* Support nondim coords in make_meta.
* Add Dataset.unify_chunks
* doc updates.
* minor.
* update comment.
* More complicated test dataset. Tests fail :X
* Don't know why compute is needed.
* work with DataArray nondim coords.
* fastpath unify_chunks
* comment.
* much improved tests.
* Change args, kwargs syntax.
* Add dataset, dataarray methods.
* api.rst
* docstrings.
* Fix unify_chunks.
* Move assert_chunks_equal to xarray.testing.
* minor changes.
* Better error handling when inferring returned object
* wip
* wip
* better to_array
* Docstrings + nicer error message.
* remove unify_chunks in map_blocks + better tests.
* typing for unify_chunks
* address more review comments.
* more unify_chunks tests.
* Just use dask.core.utils.meta_from_array
* get tests working. assert_equal needs a lot of fixing.
* more unify_chunks test.
* assert_chunks_equal fixes.
* copy over meta_from_array.
* minor fixes.
* raise chunks error earlier and test for map_blocks raising chunk error
* fix.
* Type annotations
* py35 compat
* make sure unify_chunks does not compute.
* Make tests functional by call compute before assert_equal
* Update whats-new
* Work with attributes.
* Support attrs and name changes.
* more assert_equal
* test changing coord attribute
* fix whats new
* rework tests to use fixtures (kind of)
* more review changes.
* cleanup
* more review feedback.
* fix unify_chunks.
* read dask_array_compat :)
* Dask 1.2.0 compat.
* documentation polish
* make_meta reflow
* cosmetic
* polish
* Fix tests
* isort
* isort
* Add func call to docstrings.
1 parent 291cb80 · commit 3f29551
Showing 15 changed files with 910 additions and 26 deletions.
@@ -0,0 +1,91 @@
from distutils.version import LooseVersion

import dask.array as da
import numpy as np
from dask import __version__ as dask_version

if LooseVersion(dask_version) >= LooseVersion("2.0.0"):
    meta_from_array = da.utils.meta_from_array
else:
    # Copied from dask v2.4.0
    # Used under the terms of Dask's license, see licenses/DASK_LICENSE.
    import numbers

    def meta_from_array(x, ndim=None, dtype=None):
        """ Normalize an array to appropriate meta object
        Parameters
        ----------
        x: array-like, callable
            Either an object that looks sufficiently like a Numpy array,
            or a callable that accepts shape and dtype keywords
        ndim: int
            Number of dimensions of the array
        dtype: Numpy dtype
            A valid input for ``np.dtype``
        Returns
        -------
        array-like with zero elements of the correct dtype
        """
        # If using x._meta, x must be a Dask Array, some libraries (e.g. zarr)
        # implement a _meta attribute that are incompatible with Dask Array._meta
        if hasattr(x, "_meta") and isinstance(x, da.Array):
            x = x._meta

        if dtype is None and x is None:
            raise ValueError("You must specify the meta or dtype of the array")

        if np.isscalar(x):
            x = np.array(x)

        if x is None:
            x = np.ndarray

        if isinstance(x, type):
            x = x(shape=(0,) * (ndim or 0), dtype=dtype)

        if (
            not hasattr(x, "shape")
            or not hasattr(x, "dtype")
            or not isinstance(x.shape, tuple)
        ):
            return x

        if isinstance(x, list) or isinstance(x, tuple):
            ndims = [
                0
                if isinstance(a, numbers.Number)
                else a.ndim
                if hasattr(a, "ndim")
                else len(a)
                for a in x
            ]
            a = [a if nd == 0 else meta_from_array(a, nd) for a, nd in zip(x, ndims)]
            return a if isinstance(x, list) else tuple(x)

        if ndim is None:
            ndim = x.ndim

        try:
            meta = x[tuple(slice(0, 0, None) for _ in range(x.ndim))]
            if meta.ndim != ndim:
                if ndim > x.ndim:
                    meta = meta[
                        (Ellipsis,) + tuple(None for _ in range(ndim - meta.ndim))
                    ]
                    meta = meta[tuple(slice(0, 0, None) for _ in range(meta.ndim))]
                elif ndim == 0:
                    meta = meta.sum()
                else:
                    meta = meta.reshape((0,) * ndim)
        except Exception:
            meta = np.empty((0,) * ndim, dtype=dtype or x.dtype)

        if np.isscalar(meta):
            meta = np.array(meta)

        if dtype and meta.dtype != dtype:
            meta = meta.astype(dtype)

        return meta
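To make the compat shim above concrete, here is a hedged usage sketch. It assumes the file shown is importable as xarray.core.dask_array_compat (the commit message entries "copy over meta_from_array." and "read dask_array_compat :)" suggest that name, but the diff does not show the file path). On dask >= 2.0.0 the call simply dispatches to dask.array.utils.meta_from_array; older dask versions use the copied v2.4.0 implementation above.

    import numpy as np
    import dask.array as da

    # Hypothetical import path; the diff does not show the file name.
    from xarray.core.dask_array_compat import meta_from_array

    x = da.ones((4, 4), chunks=2)
    meta = meta_from_array(x, ndim=2)
    # meta is a zero-element array with the same ndim and dtype as x
    # (a numpy ndarray of shape (0, 0), dtype float64), which callers can
    # use to infer result types without computing anything.

    # Following the copied v2.4.0 code above: with x=None, a dtype is enough,
    # and omitting both x and dtype raises ValueError.
    empty = meta_from_array(None, ndim=1, dtype=np.dtype("int64"))  # shape (0,), int64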