Rename path argument to "dataset" in hdf5_lookup #775

Merged · 3 commits · Aug 9, 2024
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -12,6 +12,7 @@ Write the date in place of the "Unreleased" in the case a new version is release
 - Make `tiled.client` accept a Python dictionary when fed to `write_dataframe()`.
 - The `generated_minimal` example no longer requires pandas and instead uses a Python dict.
 - Remove unused pytest-warning ignores from `test_writing.py`.
+- Rename argument in `hdf5_lookup` function from `path` to `dataset` to reflect change in `ophyd_async`.

 ### Fixed
 - A bug in `Context.__getstate__` caused pickling to fail if applied twice.
10 changes: 8 additions & 2 deletions tiled/adapters/hdf5.py
@@ -384,6 +384,7 @@ def hdf5_lookup(
     libver: str = "latest",
     specs: Optional[List[Spec]] = None,
     access_policy: Optional[AccessPolicy] = None,
+    dataset: Optional[Union[List[Path], List[str]]] = None,
     path: Optional[Union[List[Path], List[str]]] = None,
 ) -> Union[HDF5Adapter, ArrayAdapter]:
     """
@@ -397,13 +398,18 @@ def hdf5_lookup(
     libver :
     specs :
     access_policy :
+    dataset :
     path :

     Returns
     -------

     """
-    path = path or []
+
+    if dataset is not None and path is not None:
+        raise ValueError("dataset and path kwargs should not both be set!")
+
+    dataset = dataset or path or []
     adapter = HDF5Adapter.from_uri(
         data_uri,
         structure=structure,
@@ -413,7 +419,7 @@ def hdf5_lookup(
         specs=specs,
         access_policy=access_policy,
     )
-    for segment in path:
+    for segment in dataset:
         adapter = adapter.get(segment)  # type: ignore
         if adapter is None:
             raise KeyError(segment)
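The pattern in this diff — introducing the new `dataset` kwarg while keeping the old `path` kwarg as a fallback — can be sketched in isolation. The function below is a hypothetical, simplified stand-in for `hdf5_lookup` (no HDF5 I/O, no adapter traversal), showing only the kwarg-resolution logic the PR adds:

```python
from typing import List, Optional


def lookup_segments(
    dataset: Optional[List[str]] = None,
    path: Optional[List[str]] = None,  # deprecated alias for `dataset`
) -> List[str]:
    """Resolve dataset segments, still accepting the old `path` kwarg."""
    # Reject ambiguous calls that supply both the new and the old name.
    if dataset is not None and path is not None:
        raise ValueError("dataset and path kwargs should not both be set!")
    # Prefer the new kwarg, fall back to the deprecated one, then to [].
    return list(dataset or path or [])
```

Note that `dataset or path or []` treats an explicitly passed empty list the same as `None`, collapsing both to `[]` — harmless here, since an empty segment list means "no traversal" either way.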