It has an unlimited dimension, 'time', with current length 5:
In [10]: ds.dimensions
Out[10]:
OrderedDict([('veg_class',
<class 'netCDF4._netCDF4.Dimension'>: name = 'veg_class', size = 19),
('lat',
<class 'netCDF4._netCDF4.Dimension'>: name = 'lat', size = 160),
('lon',
<class 'netCDF4._netCDF4.Dimension'>: name = 'lon', size = 160),
('time',
<class 'netCDF4._netCDF4.Dimension'> (unlimited): name = 'time', size = 5)])
However, the value of Variable.chunking() for the 'time' variable is 2 ** 20 (1048576):
In [7]: ds.variables['time']
Out[7]:
<class 'netCDF4._netCDF4.Variable'>
int32 time(time)
units: days since 2000-01-01 00:00:00.0
unlimited dimensions: time
current shape = (5,)
filling on, default _FillValue of -2147483647 used
In [8]: ds.variables['time'].chunking()
Out[8]: [1048576]
This results in the error "ValueError: chunksize cannot exceed dimension size" when attempting to write a new Variable with chunksizes set equal to that chunking.
It would be nice if netCDF4-Python offered the guarantee that all read chunksizes were valid chunksizes for writing, perhaps by truncating larger chunksizes.
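The proposed truncation could be sketched as a small helper that clamps each chunksize read from a source variable to the current length of the corresponding dimension before passing it to createVariable. This is a minimal illustration of the idea, not netCDF4-Python's actual behavior; the function name is hypothetical:

```python
def clamp_chunksizes(chunksizes, dim_sizes):
    """Truncate each chunksize to its dimension's current length,
    so the result is always a valid chunksizes argument when
    creating a variable over fixed-size dimensions.

    Hypothetical helper: chunksizes as returned by Variable.chunking(),
    dim_sizes as the current lengths of the variable's dimensions.
    """
    return [min(c, s) for c, s in zip(chunksizes, dim_sizes)]


# The 'time' variable above: chunking() == [1048576], current length 5.
print(clamp_chunksizes([1048576], [5]))  # → [5]
```

A clamped result like [5] could then be passed as chunksizes= to createVariable without triggering the ValueError, at the cost of losing the original (unlimited-dimension) chunk layout.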
I guess you are trying to write a new variable in which the time dimension is not unlimited? Apparently only unlimited dimensions can have chunksizes larger than their (current) size.
For example, consider the netCDF attached to this comment:
pydata/xarray#1225 (comment)