Dataset.__repr__ computes dask variables #1522
I think we should probably just print the placeholders for dask arrays in all cases. The current behavior should require calling a …
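A minimal sketch of the proposed behaviour, assuming the explicit call would be the existing `.load()`/`.compute()` methods (the array sizes and the repr comments are illustrative):

```python
import dask.array as da
import xarray as xr

ds = xr.Dataset({"v": ("x", da.zeros(1_000_000, chunks=100_000))})

# Proposed: repr(ds) would show a dask placeholder for "v" instead of
# computing a preview of its values. Getting real values would then
# require an explicit call:
ds.load()      # compute and store values in place
ds.compute()   # or return a computed copy, leaving ds lazy
```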
working on this now
This is in Jupyter:
Output:
YIKES!
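The code and output above did not survive extraction; a hedged reconstruction of the kind of session being described (array sizes and names are made up):

```python
import dask.array as da
import xarray as xr

# A small-looking result sitting at the end of an expensive dask graph
big = da.random.random((100_000, 1_000), chunks=(10_000, 1_000))
ds = xr.Dataset({"mean": ("x", big.mean(axis=1))})

ds          # Dataset.__repr__ evaluates the whole graph to show a preview
ds["mean"]  # DataArray.__repr__ prints a dask placeholder instead
```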
There is a separate problem where index coords are computed twice. I didn't fix it yet, and I am afraid of a domino effect. The problem is in merge.py:merge_coords():
Here, both expand_variable_dicts() and _get_priority_vars() compute the dask array.
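The merge.py excerpt was lost in extraction; below is a sketch of the pattern being described, with hypothetical stand-ins for the two code paths (only the names expand_variable_dicts() and _get_priority_vars() come from the comment):

```python
import dask
import dask.array as da
import numpy as np

n_computes = 0

@dask.delayed
def _load():
    global n_computes
    n_computes += 1
    return np.arange(4)

coord = da.from_delayed(_load(), shape=(4,), dtype=int)

def expand_variable_dicts():   # hypothetical stand-in for the first path
    return np.asarray(coord)   # coerces the dask array: first compute

def _get_priority_vars():      # hypothetical stand-in for the second path
    return np.asarray(coord)   # coerces it again: second compute

expand_variable_dicts()
_get_priority_vars()
assert n_computes == 2         # the same coord was evaluated twice
```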
Index coordinates (i.e., those backed by a pandas.Index) are always loaded into memory. If they are getting computed via …
Given this:
This correctly computes the coord once:
While this computes it twice:
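All three snippets above were lost in extraction. Below is a hedged reconstruction that instruments the compute count; which operations hit the once vs. twice paths is an assumption, since the comment only establishes that one access pattern computes the coord once while another, going through merge_coords(), computes it twice:

```python
import dask
import dask.array as da
import numpy as np
import xarray as xr

n_computes = 0

@dask.delayed
def _load():
    global n_computes
    n_computes += 1
    return np.arange(4)

def make_ds():
    # "Given this": a Dataset with a dask-backed, non-index coordinate
    lazy = da.from_delayed(_load(), shape=(4,), dtype=int)
    return xr.Dataset(coords={"c": ("x", lazy)})

ds = make_ds()
ds.load()               # direct load: computes the coord once
print(n_computes)       # 1

ds2 = make_ds()
xr.merge([ds2, ds2])    # goes through merge_coords(): computes it twice
print(n_computes)       # 3, i.e. two more computes (assumed)
```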
Hmm. My guess (untested) is that …
Travis is failing in a few environments, but I just tested that I get the exact same errors in the master branch.
* Load non-index coords to memory ahead of concat
* Update unit test after #1522
* Minimise loads on concat; extend new concat logic to data_vars
* Trivial tweaks
* Added unit tests; fix loads when vars are found different halfway through
* Add xfail for #1586
* Revert "Add xfail for #1586" (this reverts commit f99313c)
DataArray.__repr__ and Variable.__repr__ print a placeholder if the data uses the dask backend.
Not so Dataset.__repr__, which computes the data in order to print a tiny preview of it.
This issue is extremely annoying when working in Jupyter, and particularly acute when the chunks are very big or sit at the end of a very long chain of computation.
For data variables, the expected behaviour is to print a placeholder, just like DataArray does (see the illustration below).
For coords, we could either …
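For comparison, the placeholder that DataArray.__repr__ already prints for dask-backed data (exact formatting varies across versions; the output shown is an illustration):

```python
import dask.array as da
import xarray as xr

a = xr.DataArray(da.zeros((1000, 1000), chunks=(100, 100)), dims=("x", "y"))
print(repr(a))
# <xarray.DataArray (x: 1000, y: 1000)>
# dask.array<shape=(1000, 1000), dtype=float64, chunksize=(100, 100)>
# Dimensions without coordinates: x, y
```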