What is your issue?
In attempting to concatenate many datasets along a large dimension (total size ~100,000,000), I'm finding very slow performance: tens of seconds just to concatenate two datasets.
With some profiling, I find that all the time is being spent in this list comprehension:
xarray/xarray/core/concat.py
Line 584 in 51554f2
I don't know exactly what's going on here, but it doesn't look right: if the size of the dimension being concatenated is large, this list comprehension can run millions of iterations, which doesn't seem proportional to the intended behaviour.
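As a rough order-of-magnitude check (independent of xarray, and not profiled against the actual code path), even a trivial pure-Python list comprehension over tens of millions of items costs a noticeable amount of time, which is consistent with the slowdown I'm seeing:

```python
import time

# Time a trivial pure-Python list comprehension; n is a fraction of the
# ~100,000,000-element dimension so the example stays light on memory.
n = 10_000_000
start = time.perf_counter()
_ = [None for _ in range(n)]
elapsed = time.perf_counter() - start
# Scales roughly linearly with n, so ~1e8 iterations lands in the seconds range.
print(f"{elapsed:.2f}s for {n:,} iterations")
```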
Sorry I don't have an MRE for this yet, but please let me know if I can help further.
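In the meantime, here is a rough sketch of what a reproducer might look like; the variable names, sizes, and dataset layout are guesses on my part, and I haven't confirmed this hits the same code path:

```python
import numpy as np
import xarray as xr

# Build two datasets that each cover half of a very large "x" dimension,
# with an explicit coordinate along that dimension. Reduce n if memory is tight.
n = 50_000_000  # per piece; the combined dimension is ~100,000,000

def make_piece(start: int) -> xr.Dataset:
    coord = np.arange(start, start + n)
    data = np.zeros(n, dtype="float32")
    return xr.Dataset({"var": ("x", data)}, coords={"x": ("x", coord)})

pieces = [make_piece(0), make_piece(n)]
combined = xr.concat(pieces, dim="x")  # the reported slowdown is in this call
```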