
OSError: Unable to read attribute (No appropriate function for conversion path) #16

Closed
mangecoeur opened this issue Jun 7, 2016 · 7 comments


@mangecoeur

I'm trying to open a netCDF4 file created by xarray (made by opening a netCDF3 file and re-saving it as netCDF4 with `.to_netcdf()`), and I get the error below.

import h5netcdf

with h5netcdf.File('test.nc', 'r') as fd:
    print(fd['TMP_L103'])
---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-2-f423ec5ccbc8> in <module>()
----> 1 dset2 = xr.open_dataset('test.nc', engine='h5netcdf')

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, engine, chunks, lock, drop_variables)
    225             lock = _default_lock(filename_or_obj, engine)
    226         with close_on_error(store):
--> 227             return maybe_decode_store(store, lock)
    228     else:
    229         if engine is not None and engine != 'scipy':

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    156             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    157             concat_characters=concat_characters, decode_coords=decode_coords,
--> 158             drop_variables=drop_variables)
    159 
    160         if chunks is not None:

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    880         file_obj = obj._file_obj
    881     elif isinstance(obj, AbstractDataStore):
--> 882         vars, attrs = obj.load()
    883         extra_coords = set()
    884         file_obj = obj

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/common.py in load(self)
    112         """
    113         variables = FrozenOrderedDict((_decode_variable_name(k), v)
--> 114                                       for k, v in iteritems(self.get_variables()))
    115         attributes = FrozenOrderedDict(self.get_attrs())
    116         return variables, attributes

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/h5netcdf_.py in get_variables(self)
     68     def get_variables(self):
     69         return FrozenOrderedDict((k, self.open_store_variable(v))
---> 70                                  for k, v in iteritems(self.ds.variables))
     71 
     72     def get_attrs(self):

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/core/utils.py in FrozenOrderedDict(*args, **kwargs)
    274 
    275 def FrozenOrderedDict(*args, **kwargs):
--> 276     return Frozen(OrderedDict(*args, **kwargs))
    277 
    278 

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/h5netcdf_.py in <genexpr>(.0)
     68     def get_variables(self):
     69         return FrozenOrderedDict((k, self.open_store_variable(v))
---> 70                                  for k, v in iteritems(self.ds.variables))
     71 
     72     def get_attrs(self):

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/h5netcdf_.py in open_store_variable(self, var)
     53         dimensions = var.dimensions
     54         data = indexing.LazilyIndexedArray(var)
---> 55         attrs = _read_attributes(var)
     56 
     57         # netCDF4 specific encoding

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/xarray/backends/h5netcdf_.py in _read_attributes(h5netcdf_var)
     24     attrs = OrderedDict()
     25     for k in h5netcdf_var.ncattrs():
---> 26         v = h5netcdf_var.getncattr(k)
     27         if k not in ['_FillValue', 'missing_value']:
     28             v = maybe_decode_bytes(v)

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/h5netcdf/legacyapi.py in getncattr(self, name)
      6 
      7     def getncattr(self, name):
----> 8         return self.attrs[name]
      9 
     10     def setncattr(self, name, value):

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/h5netcdf/attrs.py in __getitem__(self, key)
     14         if key in _hidden_attrs:
     15             raise KeyError(key)
---> 16         return self._h5attrs[key]
     17 
     18     def __setitem__(self, key, value):

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (-------src-dir--------/h5py/_objects.c:2582)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (-------src-dir--------/h5py/_objects.c:2541)()

/Users/jonathanchambers/anaconda/lib/python3.5/site-packages/h5py/_hl/attrs.py in __getitem__(self, name)
     77 
     78         arr = numpy.ndarray(shape, dtype=dtype, order='C')
---> 79         attr.read(arr, mtype=htype)
     80 
     81         if len(arr.shape) == 0:

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (-------src-dir--------/h5py/_objects.c:2582)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (-------src-dir--------/h5py/_objects.c:2541)()

h5py/h5a.pyx in h5py.h5a.AttrID.read (-------src-dir--------/h5py/h5a.c:5123)()

h5py/_proxy.pyx in h5py._proxy.attr_rw (-------src-dir--------/h5py/_proxy.c:915)()

OSError: Unable to read attribute (No appropriate function for conversion path)

By contrast, using h5py directly:

import h5py as h5
with h5.File('test.nc', 'r') as fd:
    print(fd['TMP_L103'])

Works fine. This error carries through to xarray itself if I try to use the h5netcdf backend.

@shoyer
Collaborator

shoyer commented Jun 7, 2016

Huh, this is a curious one. Can you share a problematic file? Or the name and/or value of the problematic attribute, from pdb? (e.g., `%debug` in a Jupyter notebook)

Note that your h5py example is not entirely analogous, because printing a variable in h5py does not load its attributes. If `list(fd['TMP_L103'].attrs.items())` also does not raise, then something very strange is going on.
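One way to run that check is to try each attribute individually and collect the failures instead of stopping at the first `OSError`. This is a hypothetical diagnostic helper, not code from either library; the variable name is just the one from this report:

```python
import h5py

def find_unreadable_attrs(path, var):
    """Try each attribute of `var` individually and collect the ones
    h5py fails to convert, rather than stopping at the first OSError."""
    bad = []
    with h5py.File(path, 'r') as fd:
        for name in fd[var].attrs.keys():
            try:
                fd[var].attrs[name]  # force the attribute read
            except OSError as err:
                bad.append((name, str(err)))
    return bad
```

On the file from this report, `find_unreadable_attrs('test.nc', 'TMP_L103')` should point straight at the offending attribute(s).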

@mangecoeur
Author

Ah, good point - `list(fd['TMP_L103'].attrs.items())` does raise the same error with h5py.

The file in question is this and the crash occurs reading:

>>> attr.name
b'standard_name'

@shoyer
Collaborator

shoyer commented Jun 9, 2016

Both h5dump and ncdump can read the attribute in question, so I can only presume that this is an h5py bug:

h5dump -a /TMP_L103/standard_name ~/Downloads/test\ \(1\).nc
HDF5 "/Users/shoyer/Downloads/test (1).nc" {
ATTRIBUTE "standard_name" {
   DATATYPE  H5T_STRING {
      STRSIZE 15;
      STRPAD H5T_STR_NULLTERM;
      CSET H5T_CSET_UTF8;
      CTYPE H5T_C_S1;
   }
   DATASPACE  SCALAR
   DATA {
   (0): "air_temperature"
   }
}
}

@mangecoeur
Author

OK, I will open the issue there. Thanks for looking at this!

@laliberte
Contributor

So, I've investigated this problem a bit and found that reading char arrays is buggy within h5py. In pull request #17, I've added a check that verifies whether reading the char array works with pure h5py before attempting it with h5netcdf. At the moment, h5py fails at this operation in both Python 3.4 and 3.5.
Maybe the exception could be caught at line 86 of core.py, so that a more verbose `NotImplementedError('Reading char arrays is currently not supported in Python 3.')` is raised instead.
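As a sketch of that suggestion (a hypothetical wrapper, not the actual pull-request code), the opaque conversion error could be caught and re-raised with a clearer message:

```python
def getncattr_verbose(h5attrs, name):
    """Hypothetical wrapper around attribute access: turn h5py's opaque
    conversion OSError into an explicit NotImplementedError."""
    try:
        return h5attrs[name]
    except OSError as err:
        if 'conversion path' in str(err):
            raise NotImplementedError(
                'Reading this attribute (e.g. a char array or fixed-length '
                'UTF-8 string) is not supported by this h5py version.') from err
        raise  # unrelated OSError: propagate unchanged
```

Any other `OSError` (a truncated file, say) still propagates as-is, so only the known conversion failure gets the friendlier message.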

@shoyer
Collaborator

shoyer commented Aug 7, 2016

This bug report in netCDF-C describes things more fully: Unidata/netcdf-c#298

But if only h5py supported reading fixed-length UTF-8 strings, this would be much less painful.

@shoyer
Collaborator

shoyer commented Mar 21, 2017

The fix for this (to avoid making files that h5netcdf can't read) was just merged upstream in netcdf-c.

It looked like getting a fix for this into h5py would have been difficult, so I'm going to close this issue for now -- hopefully the offending files will eventually stop being created, as users upgrade their netCDF installations.
