open_datatree not accepting backend-specific keyword arguments #9135
What happened?
Following #9014, open_datatree no longer passes on backend-specific keyword arguments. For instance, I have a few HDF5 datasets with unassociated dimension scales, and those require phony_dims='sort' or phony_dims='access' to be opened. Now, the keyword arguments are passed on to StoreBackendEntrypoint.open_dataset instead of H5netcdfBackendEntrypoint.open_dataset, which results in a TypeError.
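For concreteness, a minimal sketch of the failing pattern; the file name, the way the HDF5 file is created, and the exact import location of open_datatree are illustrative assumptions rather than details taken from the original report.

```python
import h5py
import numpy as np
import xarray as xr

# An HDF5 file written with plain h5py has no dimension scales attached,
# so h5netcdf needs phony_dims to open it (hypothetical file name).
with h5py.File("example.h5", "w") as f:
    f.create_dataset("data", data=np.arange(6.0).reshape(2, 3))

# open_dataset forwards the backend keyword to the h5netcdf backend and works:
ds = xr.open_dataset("example.h5", engine="h5netcdf", phony_dims="sort")

# open_datatree (top-level in recent releases; around 2024.6 it may need to be
# imported from xarray.backends.api) raises TypeError, because phony_dims ends
# up in StoreBackendEntrypoint.open_dataset:
dt = xr.open_datatree("example.h5", engine="h5netcdf", phony_dims="sort")
```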
What did you expect to happen?

The open_datatree methods of backends should pass backend-specific keyword arguments to H5NetCDFStore.open or the like.
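As a rough illustration of where such keywords should end up (not the actual fix, and reusing the hypothetical example.h5 from the sketch above), the store layer already understands phony_dims:

```python
from xarray.backends import H5NetCDFStore

# H5NetCDFStore.open, which the h5netcdf backend builds on, accepts phony_dims
# directly; the backend's open_datatree would only need to forward it.
store = H5NetCDFStore.open("example.h5", phony_dims="sort")
try:
    print(store.get_dimensions())  # the phony dimensions are now resolvable
finally:
    store.close()
```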
Minimal Complete Verifiable Example

MVCE confirmation
Relevant log output
No response
Anything else we need to know?
No response
Environment
INSTALLED VERSIONS
commit: None
python: 3.12.4 | packaged by conda-forge | (main, Jun 17 2024, 10:13:44) [Clang 16.0.6 ]
python-bits: 64
OS: Darwin
OS-release: 23.5.0
machine: arm64
processor: arm
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.3
libnetcdf: None
xarray: 2024.6.0
pandas: 2.2.2
numpy: 2.0.0
scipy: None
netCDF4: None
pydap: None
h5netcdf: 1.3.0
h5py: 3.11.0
zarr: None
cftime: None
nc_time_axis: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 70.0.0
pip: 24.0
conda: None
pytest: None
mypy: None
IPython: None
sphinx: None