🩹 Fix result data overwritten when using multiple dataset_groups (#1147)
* 🧪 Extended test reproducing overwriting existing result dataset

* 🩹 Filter datasets in OptimizationGroup.create_result_data to used ones

This prevents overwriting existing data from a different dataset model.

* 🚧📚 Added change to changelog
s-weigand authored Oct 16, 2022
1 parent 69bb7dc commit 2e5df6e
Showing 3 changed files with 11 additions and 1 deletion.
2 changes: 2 additions & 0 deletions changelog.md
@@ -14,6 +14,8 @@
 
 ### 🩹 Bug fixes
 
+- 🩹 Fix result data overwritten when using multiple dataset_groups (#1147)
+
 ### 📚 Documentation
 
 ### 🗑️ Deprecations (due in 0.9.0)
6 changes: 5 additions & 1 deletion glotaran/optimization/optimization_group.py
@@ -131,7 +131,11 @@ def create_result_data(self) -> dict[str, xr.Dataset]:
         dict[str, xr.Dataset]
             The datasets with the results.
         """
-        result_datasets = {label: data.copy() for label, data in self._data.items()}
+        result_datasets = {
+            label: data.copy()
+            for label, data in self._data.items()
+            if label in self._dataset_group.dataset_models.keys()
+        }
 
         global_matrices, matrices = self._matrix_provider.get_result()
         clps, residuals = self._estimation_provider.get_result()
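
The essential change is the `if label in self._dataset_group.dataset_models.keys()` filter: create_result_data now only copies the datasets that belong to the group's own dataset models instead of all of self._data. A minimal, hypothetical sketch of why that matters (plain dicts and invented names rather than the real glotaran objects), assuming the per-group result dicts are merged into one mapping:

    # Hypothetical, simplified sketch -- plain dicts and invented names stand in
    # for xr.Dataset and the real OptimizationGroup internals.
    def create_result_data(data, group_labels, fit, *, filter_to_group=True):
        # Before the fix every dataset was copied; the fix keeps only the
        # datasets that belong to this dataset group.
        result = {
            label: dict(values)
            for label, values in data.items()
            if not filter_to_group or label in group_labels
        }
        # Results are only ever written into the group's own datasets.
        for label in group_labels:
            result[label]["fitted_data"] = fit
        return result

    data = {"dataset1": {"data": 1}, "dataset2": {"data": 2}}

    # Buggy behaviour: group B's unfiltered copy of dataset1 replaces group A's result.
    broken = {}
    broken.update(create_result_data(data, {"dataset1"}, "fit_A", filter_to_group=False))
    broken.update(create_result_data(data, {"dataset2"}, "fit_B", filter_to_group=False))
    assert "fitted_data" not in broken["dataset1"]

    # Fixed behaviour: each group only returns its own datasets, so nothing is lost.
    fixed = {}
    fixed.update(create_result_data(data, {"dataset1"}, "fit_A"))
    fixed.update(create_result_data(data, {"dataset2"}, "fit_B"))
    assert fixed["dataset1"]["fitted_data"] == "fit_A"
    assert fixed["dataset2"]["fitted_data"] == "fit_B"
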
4 changes: 4 additions & 0 deletions glotaran/optimization/test/test_multiple_goups.py
@@ -65,3 +65,7 @@ def test_multiple_groups():
     for label, param in result.optimized_parameters.all():
         if param.vary:
             assert np.allclose(param.value, wanted_parameters.get(label).value, rtol=1e-1)
+
+    for dataset in result.data.values():
+        assert "weighted_root_mean_square_error" in dataset.attrs
+        assert "fitted_data" in dataset.data_vars
