Currently we have an asymmetry between the "check" and "compare" halves of the model-check tool. The "check" tests can be defined in YAML files, but so far we have no support for "compare" tests defined in YAML.
We are trying to define many more comparison tests for En-ROADS and C-ROADS, but the custom scenarios are hard to define in code, and it would be better if we could define them in YAML files (and eventually, finish the browser-based tool for defining both "check" and "compare" tests).
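To make the proposal concrete, a YAML-defined "compare" test could pair a custom input scenario with the datasets to be compared. The shape below is purely a sketch; the keys, scenario names, and dataset names are illustrative, not the actual schema:

```yaml
# Hypothetical schema for a YAML-defined "compare" test.
# All keys and names are illustrative only.
- compare:
    scenario:
      name: coal tax at maximum
      with:
        - input: Coal tax rate
          at: max
    datasets:
      - Temperature change from 1850
      - Total primary energy demand
    tolerance: 0.01
```

Defining scenarios declaratively like this (rather than in code) would also make them consumable by the planned browser-based tool.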
As a related enhancement (best implemented at the same time as adding the YAML support), I'm going to add support for "user scenarios" in the "Comparisons" section of the model-check report. We already have all the underlying support for setting up custom input scenarios, but these user scenarios will allow for customizing what we show in the report. For example, you can set up a user scenario that sets a few specific inputs in the model, and then in the detail page for that scenario, it can show specified user-oriented graphs (as the user would see in a simulator). We already have most of the support for the UI part of this implemented as well, so this is just a matter of shuffling some things around for better presentation.
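Similarly, a "user scenario" entry could pair a set of input settings with the user-oriented graphs to show on that scenario's detail page. Again, this is only a sketch of one possible shape, with invented input and graph names:

```yaml
# Hypothetical "user scenario" definition; keys and names are illustrative.
- scenario:
    name: high renewables subsidy
    with:
      - input: Renewables subsidy
        at: max
    graphs:
      - Global Sources of Primary Energy
      - Temperature Increase
```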