Fixes #320 #321
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@ Coverage Diff @@
##           develop     #321      +/-   ##
===========================================
- Coverage    95.85%   95.81%   -0.04%
===========================================
  Files           39       39
  Lines         2145     2101      -44
===========================================
- Hits          2056     2013      -43
+ Misses          89       88       -1

View full report in Codecov by Sentry.
Adds a method to sample initial points within the problem class, enabling a resample step within SciPyMinimize when the initial cost is infinite, and refactors the prior classes onto a shared base. Adds tests for the prior class update and SciPyMinimize resampling; also updates sigma0 for GradientDescent and loosens an assertion.
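For context, a shared prior base along these lines could look as follows. This is a minimal sketch only: apart from the `pybop.BasePrior` name mentioned in this PR, the class layout and method names are illustrative, not the actual pybop API.

```python
import numpy as np
from scipy import stats


class BasePrior:
    """Sketch of a shared prior base; each subclass stores a frozen
    scipy.stats distribution on self.prior."""

    def pdf(self, x):
        return self.prior.pdf(x)

    def logpdf(self, x):
        return self.prior.logpdf(x)

    def rvs(self, size=1, random_state=None):
        # Used for (re)sampling initial conditions
        return self.prior.rvs(size=size, random_state=random_state)


class Gaussian(BasePrior):
    def __init__(self, mean, sigma):
        self.prior = stats.norm(loc=mean, scale=sigma)


# Example: draw candidate initial values from the prior
prior = Gaussian(mean=0.6, sigma=0.02)
sample = prior.rvs(size=3, random_state=8)
```

Centralising `pdf`/`logpdf`/`rvs` on the base class keeps each concrete prior down to a constructor that builds the frozen distribution.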
@@ -25,6 +25,7 @@ def __init__(self, method=None, bounds=None, maxiter=None, tol=1e-5):
     super().__init__()
     self.method = method
     self.bounds = bounds
+    self.num_resamples = 40
This line is intentionally left without a set/get, as #255 enables kwargs for this class; that will be added once #255 is merged.
This is now ready for review; it appears to solve the issue. This PR splits the parameterisation tests.
Thanks @BradyPlanden - I've added a few small comments to be addressed, as well as some notes for myself regarding future updates.
if np.isinf(self.cost0):
    raise Exception("The initial parameter values return an infinite cost.")
for i in range(1, self.num_resamples):
    x0 = cost_function.problem.sample_initial_conditions(seed=i)
Will be updated in #322 to work with costs without a problem.
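Pulled out of the diff context, the resampling guard amounts to something like the following. This is a sketch only: `num_resamples` and `sample_initial_conditions(seed=...)` mirror the names in the diff above, but the standalone function and its wiring are illustrative.

```python
import numpy as np


def resample_x0(cost_function, x0, num_resamples=40):
    """Resample x0 until the cost is finite, up to num_resamples draws.

    Sketch only: assumes cost_function(x) returns a float and
    cost_function.problem.sample_initial_conditions(seed=i) returns a
    candidate parameter vector, mirroring the names in the diff above.
    """
    cost0 = cost_function(x0)
    for i in range(1, num_resamples):
        if not np.isinf(cost0):
            break
        # Each draw uses a distinct seed so failed candidates are not repeated
        x0 = cost_function.problem.sample_initial_conditions(seed=i)
        cost0 = cost_function(x0)
    if np.isinf(cost0):
        raise Exception("The initial parameter values return an infinite cost.")
    return x0, cost0
```

Raising only after the loop exhausts its budget is what lets `allow_infeasible_solutions == False` runs recover from an unlucky starting point instead of failing immediately.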
# Set initial conditions
if self.x0 is None:
    self.x0 = self.sample_initial_conditions()
elif len(x0) != self.n_parameters:
Will be updated in #322
    raise ValueError("x0 dimensions do not match number of parameters")
# Add the initial values to the parameter definitions
for i, param in enumerate(self.parameters):
    param.update(initial_value=self.x0[i])
def sample_initial_conditions( |
Will be removed as part of #322 when sampling is implemented on the new Parameters class.
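Until then, a standalone sketch of such a sampler, assuming each prior exposes a scipy-style `rvs` method (the function shown is illustrative, not the exact pybop signature):

```python
import numpy as np


def sample_initial_conditions(priors, seed=None):
    """Draw one initial parameter vector, one value per prior.

    Illustrative sketch: assumes each prior exposes rvs(random_state=...),
    e.g. a frozen scipy.stats distribution. Passing a fixed seed makes the
    resampling loop reproducible across optimiser runs.
    """
    rng = np.random.default_rng(seed)
    return np.asarray([float(prior.rvs(random_state=rng)) for prior in priors])
```

Sharing one `Generator` across the priors keeps a single seed sufficient to reproduce the whole vector.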
One more: please add a Changelog entry - thanks!
This commit reduces the possible variance of solutions for the parameterisation test. It also updates hyper-parameters and defends against initial-condition samples that equal the ground-truth solution. The priors are also narrowed to improve robustness.
Ok, this is a step in the right direction. The scheduled tests are much closer to passing with the addition of the flaky plugin and moving the flakiest tests to a Thevenin parameterisation test. I've triggered the CI three times on this PR and it's passed so far. I suspect we will get some sporadic failures after this merge, as shown on the scheduled CI above, but this will be a big improvement over the current state. @NicolaCourtier if you want to take another look that would be great, otherwise I'll merge as is with a name change.
Thanks @BradyPlanden! Let's proceed with the name change and merge, as it looks like a significant improvement.
Description
This PR updates the `test_parameterisation.py` integration tests. This is completed by hyper-parameter updates for the extended synthetic data and a change to the parameter priors to remove incorrect biasing of the MAP cost function.
In addition, it adds a prior resampling method to improve `pybop.SciPyMinimize` stability. This avoids failing tests when `allow_infeasible_solutions == False`, as x0 will be resampled when possible. To achieve this, the `pybop.Prior` classes have been refactored through the addition of a base class, `pybop.BasePrior`.
Issue reference
Fixes #320
Review
Before you mark your PR as ready for review, please ensure that you've considered the following:
Type of change
Key checklist:
- `$ pre-commit run` (or `$ nox -s pre-commit`) (see CONTRIBUTING.md for how to set this up to run automatically when committing locally, in just two lines of code)
- `$ nox -s tests`
- `$ nox -s doctest`

You can run integration tests, unit tests, and doctests together at once, using `$ nox -s quick`.

Further checks:
Thank you for contributing to our project! Your efforts help us to deliver great software.