Step size of fixed parameters changes after v2.12.1 due to heuristic being applied #762
Comments
Assigning zero to errors is invalid, but in the past this was not caught. So in my view it is not a bug, but a feature that this code is now giving you a warning message. How is this causing problems and why do you set errors to zero in the first place?
Setting the
@alexander-held Ping
It is straightforward to work around this in code calling iminuit. The underlying issue in my view is that the errors double as both the initial step sizes and the reported uncertainties. Would a "correction" of the final errors for fixed parameters be an option?
@alexander-held Thank you for the additional discussion.
I fully agree with that analysis, but consistently using Minuit.fixed is the cleaner way to fix parameters.
If iminuit should support fixing parameters by setting errors to zero, we have to agree on a reasonable behavior when the user runs migrad or another minimizer with parameters that have zero step size but are not fixed. In that case, the step size should be set to the heuristic, I think.
I completely agree with using Minuit.fixed. The only thing that has changed in my workflow is that previously I set step sizes to zero before fitting (and in addition specified what I'd like to keep fixed via Minuit.fixed).

I see the complication of repeated fits with new configurations where a zero step size is no longer appropriate. I agree that it makes sense to use the heuristic whenever a non-fixed parameter is encountered with zero step size. What I would like to suggest considering is not applying the heuristic to parameters that are configured as fixed. The step size for those parameters does not matter for the fit, while in scenarios with multiple fits the heuristic could be applied before the next minimization whenever the fixed status of a parameter changes.
Ah, but then the real issue is that m.errors should return 0 for fixed parameters? I agree that this would be nicer and more intuitive, but like you said it is not possible, since m.errors has this double use as actual errors and initial step sizes. I think I cannot resolve this awkwardness, which originates from how C++ Minuit2 works.

For the covariance matrix, where no such restrictions apply, the elements that correspond to fixed parameters are set to zero. So perhaps you could just change your workflow to use the Minuit.covariance matrix instead of Minuit.errors. This way, you also get the correlations. To get the errors, take the square root of its diagonal.
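A minimal sketch of the covariance-based approach suggested above; the toy cost function, parameter names, and values are illustrative assumptions, not taken from this thread.

```python
# Sketch (assumed example): read per-parameter uncertainties from the covariance
# matrix, whose entries for fixed parameters are zero, instead of from m.errors.
import numpy as np
from iminuit import Minuit

def cost(a, b):
    return (a - 1.0) ** 2 + (b - 2.0) ** 2

m = Minuit(cost, a=0.0, b=2.0)
m.fixed["b"] = True
m.migrad()

cov = np.asarray(m.covariance)    # rows/columns for the fixed "b" are zero
errors = np.sqrt(np.diag(cov))    # per-parameter uncertainties; 0.0 for fixed "b"
print(errors)
```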
That would be awkward in other ways. In OOP, we try to make interfaces so that the objects always contain a valid state. A valid state for a step size is a positive value, not a zero value. That zero or negative values were accepted in the past was an error of the API.
I will try to implement #780 to make it a bit easier.
That would be ideal I think, but as you describe it seems not possible with the double use of m.errors.
The zero step size is only invalid in my view with a configuration where the corresponding parameters are not also fixed. If such parameters were allowed to float again in subsequent fits, the step size could in principle be set again to nonzero for them. Ultimately it does not make a big difference whether a user sets the step size to zero before minimization (which worked in the past) or sets the corresponding errors to zero after minimization instead.

I originally opened this issue to understand whether this effect on fixed parameters was intentional, so for me everything here is now resolved. Thank you!
You are right. I am taking the liberty here to make the API a bit simpler by requiring that errors/step sizes must always be positive, whether the parameter is fixed or not. I think that is easier to understand.
I think that's a nice pattern actually. I don't like np.where very much, but this is a good use case. Thank you, too, for the discussion; I am closing this.
* Manually set the uncertainties for fixed parameters to zero after minimization, and remove setting the uncertainty/step_sizes during minimization, to avoid warnings from iminuit. For iminuit v2.12.2+ "assigning zero to errors is invalid, but in the past this was not caught." This approach harmonizes with cabinetry for fixed parameters for iminuit v2.12.2+.
  - c.f. scikit-hep/iminuit#762
  - c.f. scikit-hep/cabinetry#346
  - c.f. https://iminuit.readthedocs.io/en/stable/changelog.html#july-15-2022
    > fix a bug in error heuristic when parameters have negative values and prevent assigning negative values to errors
* Remove tests/test_optim.py test_step_sizes_fixed_parameters_minuit as the uncertainties/step sizes values are no longer set during minimization and so are no longer in pyhf's control.
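A minimal, self-contained sketch of the kind of np.where pattern referenced above: zeroing the reported uncertainties of fixed parameters after the fit. The cost function and parameter names are illustrative assumptions.

```python
# Sketch (assumed pattern): after minimization, report zero uncertainty for
# fixed parameters instead of the step size that iminuit keeps in m.errors.
import numpy as np
from iminuit import Minuit

def cost(a, b):
    return (a - 1.0) ** 2 + (b - 2.0) ** 2

m = Minuit(cost, a=0.0, b=2.0)
m.fixed["b"] = True
m.migrad()

uncertainties = np.where(np.array(m.fixed, dtype=bool), 0.0, np.array(m.errors))
print(uncertainties)  # the entry for the fixed parameter "b" is 0.0
```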
Hi, the step size guess heuristic in #759 is being applied to fixed parameters. This results in the error of fixed parameters changing during MIGRAD. Previously the step size was carried through without changes. This allowed for quickly filtering out fixed parameters when looking at best-fit values and errors without having to refer back to a mask specifying which parameters are fixed.
Example setup:

With iminuit==v2.12.1:

With iminuit==v2.12.2 (same behavior with v2.12.3.beta2):
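The example code and outputs from the original report did not survive above; the following is a minimal sketch of the kind of setup being described, with an assumed cost function, parameter names, and step size.

```python
# Sketch (assumed reproducer): fix one parameter, give it an explicit step size,
# and compare its entry in m.errors before and after running MIGRAD.
from iminuit import Minuit

def cost(a, b):
    return (a - 1.0) ** 2 + (b - 2.0) ** 2

m = Minuit(cost, a=0.0, b=2.0)
m.fixed["b"] = True
m.errors["b"] = 0.3   # step size chosen by the user

print(m.errors)       # before the fit: "b" has the user-set step size
m.migrad()
print(m.errors)       # iminuit <= 2.12.1 carries the step size of the fixed "b"
                      # through unchanged; 2.12.2+ may overwrite it via the heuristic
```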