At present, optimizers must reach the target number of precursors before they can converge. This is a problem when an optimizer tries a parameter value so narrow that the target number cannot be identified. Such values are unlikely to optimize the feature value anyway (indeed, if the feature value is well chosen, they should not, since we should not pick parameter values that sharply reduce precursor counts).

This PR introduces a new set of methods that record when optimization has been skipped, and treats optimization as converged if the optimizer is skipped too many times, as defined by `max_skips` in the calibration field of the config. `max_skips` defaults to 1 and is unlikely to be worth changing as long as we use an exponential batch plan, since skipping an optimizer more than once implies that precursor-identification performance is at least roughly twice as bad as before.
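A minimal sketch of the skip-tracking logic described above. The class and method names here (`SkipAwareOptimizer`, `skip`, `has_converged`) are illustrative assumptions, not the actual API introduced by this PR:

```python
class SkipAwareOptimizer:
    """Sketch: an optimizer that converges after being skipped too often.

    Assumes a skip is recorded whenever a tried parameter value yields too
    few identified precursors to evaluate the feature value.
    """

    def __init__(self, max_skips: int = 1):
        # max_skips mirrors the hypothetical calibration.max_skips config field;
        # default of 1 matches the PR description.
        self.max_skips = max_skips
        self.num_skips = 0
        self.has_converged = False

    def skip(self) -> None:
        """Record a skipped optimization round."""
        self.num_skips += 1
        # Being skipped more than max_skips times counts as convergence:
        # the tried values reduce precursor counts too much to be optimal.
        if self.num_skips > self.max_skips:
            self.has_converged = True


opt = SkipAwareOptimizer(max_skips=1)
opt.skip()  # first skip: still within the allowance, not converged
opt.skip()  # second skip exceeds max_skips, so treated as converged
print(opt.has_converged)
```

With an exponential batch plan, each round roughly doubles the data available, so two consecutive skips indicate a persistent (not batch-size-related) drop in identifications — which is why a default of 1 suffices.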