Auto chunk #4064
Conversation
In my git clone, when I run the
I'm not sure why something has changed in these files (I haven't touched them), and I also can't work out what the problem is. Could this somehow be associated with loads of the checks failing below? Thanks! :) |
@AndrewWilliams3142 This is due to flake8 applying new changes from pycodestyle. xarray-devs already dived into this, see #4057. You might just need to rebase with current master, to make the error go away. |
no need to rebase, simply merge master. Edit: the failing |
Okay cheers both! I'll have a look at these now. @keewis sorry I'm still getting used to using this side of Git at the moment, could you clarify what you mean by merge? |
to merge master into the feature branch, I usually do

$ git checkout master
$ git pull  # synchronize master with `origin/master`
$ git checkout <feature-branch>
$ git merge master

and if there are merge conflicts (i.e. the merge was interrupted), I follow the advice given by git. |
Okay, that makes sense. Though, it seems that I forked the master branch before @kmuehlbauer's commit, which fixed this flake8 issue? So I think I need to make a new fork? |
no, the fork is fine: merging master will bring that commit in, so there's no need to recreate anything. If you need more explanations on git, Edit: the book even has a section on Github |
* FIX: correct dask array handling in _calc_idxminmax
* FIX: remove unneeded import, reformat via black
* fix idxmax, idxmin with dask arrays
* FIX: use array[dim].data in `_calc_idxminmax` as per @keewis suggestion, attach dim name to result
* ADD: add dask tests to `idxmin`/`idxmax` dataarray tests
* FIX: add back fixture line removed by accident
* ADD: complete dask handling in `idxmin`/`idxmax` tests in test_dataarray, xfail dask tests for dtype datetime64 (M)
* ADD: add "support dask handling for idxmin/idxmax" in whats-new.rst
* MIN: reintroduce changes added by #3953
* MIN: change if-clause to use `and` instead of `&` as per review-comment
* WIP: remove dask handling entirely for debugging purposes
* Test for dask computes
* WIP: re-add dask handling (map_blocks-approach), add `with raise_if_dask_computes()` context to idxmin-tests
* Use dask indexing instead of map_blocks.
* Better chunk choice.
* Return -1 for _nan_argminmax_object if all NaNs along dim
* Revert "Return -1 for _nan_argminmax_object if all NaNs along dim" (reverts commit 58901b9)
* Raise error for object arrays
* No error for object arrays. Instead expect 1 compute in tests.

Co-authored-by: dcherian <[email protected]>
* rename d and l to dim and length
Added changes to whats-new.rst
there's something wrong with the merge. Are you able to resolve that by yourself, or should I fix it for you? |
Do you mean the master merge? If that's wrong, would you be able to fix it for me? My bad, hopefully I'll be able to do it more cleanly in future |
if you have any unpushed commits: could you push them now? While they would not be lost, it's way easier to handle them now than after the force-push |
No unpushed commits |
Hello @AndrewWilliams3142! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-05-21 17:16:39 UTC |
then you'll have to run the commands to update your branch. For future reference, github does not work terribly well with rebases (or merges without a merge commit), so it would be good to avoid those. Edit: ignore the |
Okay, so I've traced the error back to the chunking step. Normally, calling `func` through `map_blocks` and calling it directly (then chunking) give the same chunks:

>>> def func(obj):
...     result = obj + obj.x + 5 * obj.y
...     return result
...
>>> xr.map_blocks(func, ds).unify_chunks().chunks
Frozen(SortedKeysDict({'x': (4, 4, 2), 'y': (5, 5, 5, 5), 'z': (4,)}))
>>> func(ds).chunk().unify_chunks().chunks
Frozen(SortedKeysDict({'x': (4, 4, 2), 'y': (5, 5, 5, 5), 'z': (4,)}))

However, with the changes I've made, the direct call now collapses each dimension into a single chunk:

>>> xr.map_blocks(func, ds).unify_chunks().chunks
Frozen(SortedKeysDict({'x': (4, 4, 2), 'y': (5, 5, 5, 5), 'z': (4,)}))
>>> func(ds).chunk().unify_chunks().chunks
Frozen(SortedKeysDict({'x': (10,), 'y': (20,), 'z': (4,)}))

Which means that it now fails the test. I've tried to follow through the code and see what is actually happening when this change is made, but I'm out of my depth here. My guess is that the default `chunks=None` is now being treated as a scalar chunk specification.

Edit: I think that's the problem!

>>> isinstance(None, numbers.Number)
False
>>> is_scalar(None)
True

I'll add in something to catch this. |
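To illustrate the suspicion above, here is a minimal, hypothetical stand-in for xarray's `is_scalar` helper (the real implementation differs): a generic "not iterable" check happily classifies `None` as a scalar, and an explicit guard fixes that.

```python
import numbers
from collections.abc import Iterable


def is_scalar_naive(value):
    # hypothetical simplification of a scalar check:
    # "anything that isn't iterable is a scalar" -- which includes None
    return not isinstance(value, Iterable)


def is_scalar_guarded(value):
    # explicitly exclude None before the generic check
    return value is not None and not isinstance(value, Iterable)


print(isinstance(None, numbers.Number))  # False
print(is_scalar_naive(None))             # True -- the surprising result above
print(is_scalar_guarded(None))           # False
```

With the guarded version, `chunks=None` can keep its "leave chunks alone" meaning instead of being routed down the scalar-chunks code path.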
ah right, this is now rechunking chunked objects to a single chunk when no chunks are passed. |
@dcherian do you have any idea about this? Edit: thanks to everyone for your help so far! |
You might also have to change the type annotation (Line 1701 in 2542a63) from

None, Number, Mapping[Hashable, Union[None, Number, Tuple[Number, ...]]]

to

None, Number, str, Mapping[Hashable, Union[None, Number, str, Tuple[Number, ...]]] |
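As a sketch of the suggestion above (the alias name `ChunkSpec` is made up here for readability; the actual code inlines the union on the parameter), widening the union with `str` is what lets string values such as "auto" type-check:

```python
from numbers import Number
from typing import Hashable, Mapping, Tuple, Union

# Hypothetical alias for the widened `chunks` annotation; the added `str`
# members admit string chunk specifications like "auto".
ChunkSpec = Union[
    None,
    Number,
    str,
    Mapping[Hashable, Union[None, Number, str, Tuple[Number, ...]]],
]


def chunk(chunks: ChunkSpec = None) -> None:
    ...  # placeholder body; the real method lives on Dataset
```

Without the extra `str` members, mypy would reject `ds.chunk("auto")` even though the runtime code handles it.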
Thanks @AndrewWilliams3142. We should add a test for this. |
Cheers! I forgot about the tests, will add them this week or next hopefully |
@dcherian Thanks for the tip:) Quick question: Is there a reason why you're specifying the |
|
@keewis thanks for this! I've added what I think is a suitable test for |
Thanks @AndrewWilliams3142 |
No problem ! Thanks everyone for helping me get up to speed :) |
* upstream/master:
  Improve interp performance (pydata#4069)
  Auto chunk (pydata#4064)
  xr.cov() and xr.corr() (pydata#4089)
  allow multiindex levels in plots (pydata#3938)
  Fix bool weights (pydata#4075)
  fix dangerous default arguments (pydata#4006)
Adding a chunks='auto' option to Dataset.chunk().

* Tests added in test_dask.py
* Passes isort -rc . && black . && mypy . && flake8
* whats-new.rst updated for changes
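As a usage sketch of the underlying feature this PR exposes (not the PR's own code): dask already accepts `chunks="auto"` and picks concrete chunk sizes itself from its configured target chunk size. The array shape here is an arbitrary example.

```python
import dask.array as da

# "auto" lets dask choose per-axis chunk sizes based on its configured
# target chunk size, instead of the caller spelling them out
arr = da.ones((1000, 1000), chunks="auto")
print(arr.chunks)  # concrete per-axis chunk sizes chosen by dask
```

The PR's change to `Dataset.chunk()` is what forwards such string values through to dask instead of rejecting them.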