This wasn't entirely surprising to me, but `nbytes` currently doesn't return the right value for sparse data -- at least, I think `nbytes` should show the actual size in memory?
Rather than something like `data.nnz`, which of course only exists for sparse arrays...
I'm not sure if there's a sparse flag or something, or whether you'd have to do a typecheck?
I guess it could be possible to do a typecheck for this case. It's tricky to have a smart `nbytes` that works in all cases, though. For example, it's not really possible to get this information for dask arrays with sparse chunks (dask/dask#5313).
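This is not xarray's actual implementation -- just a minimal sketch of the duck-typing idea discussed above: instead of an explicit sparse flag or an `isinstance` check against every sparse library, check for a `nnz` attribute and sum the buffers the object actually stores (the `nbytes` function and the attribute list here are assumptions for illustration; scipy's CSR format is used as a stand-in for a sparse array):

```python
import numpy as np
import scipy.sparse as sp


def nbytes(arr):
    """Sketch: prefer stored-buffer sizes when the array exposes .nnz
    (duck typing rather than a sparse flag or a hard type check)."""
    if hasattr(arr, "nnz"):
        # Sum whichever index/data buffers this sparse format keeps
        # (CSR has data/indices/indptr; COO-style formats have coords).
        return sum(
            getattr(arr, attr).nbytes
            for attr in ("data", "indices", "indptr", "coords")
            if hasattr(arr, attr)
        )
    # Dense fallback: the computation xarray uses today.
    return arr.size * arr.dtype.itemsize


dense = np.zeros((1000, 1000))          # 8 MB of float64
sp_mat = sp.csr_matrix(dense)           # same shape, almost nothing stored
```

Here `nbytes(dense)` gives the full 8,000,000 bytes, while `nbytes(sp_mat)` reports only the few kilobytes of index structure actually allocated. The duck-typing approach avoids importing any sparse library at check time, at the cost of relying on an informal protocol.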
(`nbytes` uses `size` here: xarray/xarray/core/variable.py, line 349 at a0c71c1.)
Minimal Complete Verifiable Example:
8000000000
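The example code itself wasn't captured above, but the `8000000000` figure is consistent with `size * dtype.itemsize` for a billion-element float64 array. A hedged reconstruction of that arithmetic (the shape and `nnz` are assumptions chosen to match the reported number, not the original example):

```python
import numpy as np

# xarray's Variable.nbytes is computed as size * dtype.itemsize,
# regardless of how the data is actually stored.
shape = (10**9,)                  # a billion-element array (assumed)
dtype = np.dtype("float64")

size = int(np.prod(shape))
reported = size * dtype.itemsize  # dense size: 8000000000 bytes

# A COO-style sparse array with only a handful of stored values
# would actually occupy roughly nnz * (value bytes + index bytes):
nnz = 10                          # assumed
actual = nnz * (dtype.itemsize + np.dtype("int64").itemsize)
```

With these numbers, `reported` is 8 GB while `actual` is a few hundred bytes, which is the mismatch this issue describes.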