Using the recipe presented near the bottom of http://xarray.pydata.org/en/stable/io.html for reading in multiple files via OPeNDAP, I encounter the error referenced in the subject line.
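For context, read_netcdfs is the helper from that documentation page, adapted here to take a list of OPeNDAP URLs directly and to open each file with decode_cf=False. A minimal sketch of it, reconstructed from the recipe and the traceback below (not the exact code used), looks like this:

import xarray as xr

def read_netcdfs(files, dim, transform_func=None):
    def process_one_path(path):
        # a context manager ensures each remote dataset is closed after use
        with xr.open_dataset(path, decode_cf=False) as ds:
            # transform_func can do per-file selection or aggregation
            if transform_func is not None:
                ds = transform_func(ds)
            # load the data so it is still usable after the file is closed
            ds.load()
            return ds

    paths = files
    datasets = [process_one_path(p) for p in paths]
    combined = xr.concat(datasets, dim)
    return combined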
Here is the call to read_netcdfs:

base = 'http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis.dailyavgs/surface_gauss/air.2m.gauss.%4u.nc'
files = [base % d for d in range(1948, 2016, 1)]
ds_combined = read_netcdfs(files, dim='time', transform_func=None)

I am using the option decode_cf=False in xr.open_dataset.

Here is the entire content of the error message:

ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
      2 base = 'http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis.dailyavgs/surface_gauss/air.2m.gauss.%4u.nc'
      3 files = [base % d for d in range(1948,2016,1)]
----> 4 ds_combined = read_netcdfs(files,dim='time',transform_func=None)

<ipython-input> in read_netcdfs(files, dim, transform_func)
     18     paths = files
     19     datasets = [process_one_path(p) for p in paths]
---> 20     combined = xr.concat(datasets, dim)
     21     return combined

/usr/local/lib/python2.7/dist-packages/xarray/core/combine.pyc in concat(objs, dim, data_vars, coords, compat, positions, indexers, mode, concat_over)
    112         raise TypeError('can only concatenate xarray Dataset and DataArray '
    113                         'objects, got %s' % type(first_obj))
--> 114     return f(objs, dim, data_vars, coords, compat, positions)
    115
    116

/usr/local/lib/python2.7/dist-packages/xarray/core/combine.pyc in _dataset_concat(datasets, dim, data_vars, coords, compat, positions)
    231         for k, v in iteritems(ds.variables):
    232             if k not in result_vars and k not in concat_over:
--> 233                 raise ValueError('encountered unexpected variable %r' % k)
    234             elif (k in result_coord_names) != (k in ds.coords):
    235                 raise ValueError('%r is a coordinate in some datasets but not '

ValueError: encountered unexpected variable u'nbnds'

Any ideas as to how I can avoid triggering this error?
The problem appears to be that these datasets are inconsistent: some have an nbnds variable, and some don't. Xarray doesn't know what to do in this situation, so it raises an error. You'll need to add some custom processing logic to make the datasets consistent before you merge them.
In principle, we could handle missing variables by adding NaNs, possibly by adding a flag allow_missing_variables to concat. But this doesn't exist yet.
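One generic form that custom processing can take (a sketch with illustrative names, not an xarray feature) is to trim every dataset down to the variables they all share before concatenating:

# keep only the variables present in every dataset (illustrative sketch);
# `datasets` is the list of per-file datasets built inside read_netcdfs
common = set.intersection(*(set(ds.variables) for ds in datasets))
trimmed = [ds.drop([name for name in ds.variables if name not in common])
           for ds in datasets]  # Dataset.drop here; drop_vars in newer xarray
combined = xr.concat(trimmed, dim='time')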
Yes indeed. I'm embarrassed I even posted this! It looks like the nbnds and time_bnds variables were added to the yearly files at some point. Accordingly, a simple conditional check for their existence, followed by deleting them prior to concatenation, did the trick. Too bad there is no consistency in these files at all.
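For anyone who lands here with the same problem, that conditional delete fits naturally into the transform_func hook of the recipe. A minimal sketch (variable names taken from this thread; drop_bounds is just an illustrative name):

def drop_bounds(ds):
    # drop the bounds-related variables that only exist in some yearly files,
    # so every dataset has the same set of variables before xr.concat
    extras = [name for name in ('time_bnds', 'nbnds') if name in ds.variables]
    return ds.drop(extras) if extras else ds  # Dataset.drop_vars in newer xarray

ds_combined = read_netcdfs(files, dim='time', transform_func=drop_bounds)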