Create surrogate for entmoot strategy #285

Draft
wants to merge 2 commits into main
Conversation

TobyBoyne (Collaborator)
Surrogate for Entmoot as discussed in #278.

Note that the surrogate is not called LGBM, since it interfaces with the Enting model, which contains both a mean model and an uncertainty model. Neither of those is necessarily an LGBM model (although LGBM is currently the only one implemented in ENTMOOT). I therefore feel that EntingSurrogate is the right level of abstraction (with the strategy then refactored from EntingStrategy to EntmootStrategy).
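The abstraction being argued for can be illustrated with a plain-Python sketch. This is hypothetical, not the actual bofire or ENTMOOT API; the class and parameter names are assumptions made for illustration:

```python
# Hypothetical sketch (not the actual bofire/ENTMOOT API): the surrogate
# mirrors ENTMOOT's Enting model, which bundles a mean model and an
# uncertainty model. LightGBM is just one possible mean-model backend,
# which is why the surrogate is named after Enting rather than LGBM.
class EntingSurrogate:
    def __init__(self, mean_model: str = "lgbm",
                 uncertainty_model: str = "distance_based"):
        # "lgbm" is currently the only mean model implemented in ENTMOOT,
        # but this interface does not hard-code that assumption
        self.mean_model = mean_model
        self.uncertainty_model = uncertainty_model
```

The point of the naming is that swapping the mean-model backend would not invalidate the surrogate's name or interface.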

@jduerholt (Contributor) left a comment:

I added some comments ;)

from bofire.data_models.surrogates.trainable import TrainableSurrogate


class EntingSurrogate(Surrogate, TrainableSurrogate):
@jduerholt (Contributor):

If you want to do it like this, I would call it LGBMEntingSurrogate, and if you add XGBoost support at some point you could introduce an XGBEntingSurrogate. The advantage is that the hyperparameter names do not have to match across backends, and a hyperparameter that exists in only one model is exposed only on that model.
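A minimal sketch of that per-backend split (class names and hyperparameters are illustrative assumptions, not the bofire API):

```python
# Hypothetical sketch: one surrogate data model per tree backend, so each
# class exposes only the hyperparameters its backend actually supports.
from dataclasses import dataclass

@dataclass
class LGBMEntingSurrogate:
    # LightGBM-specific hyperparameters (names are illustrative)
    num_leaves: int = 31
    min_child_samples: int = 20

@dataclass
class XGBEntingSurrogate:
    # XGBoost-specific hyperparameters (names are illustrative)
    max_depth: int = 6
    min_child_weight: float = 1.0
```

With this layout there is no shared hyperparameter schema to keep consistent; each backend's knobs live only on its own class.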

@jduerholt (Contributor):

The other option would be to have an LGBMSurrogate that includes just the mean model, and to implement LGBMEntingSurrogate as a class inheriting from LGBMSurrogate that adds the uncertainty parameters. But I know the current ENTMOOT model is set up a bit differently, so in this case you would need two different _fit methods: the first using plain LGBM and the second using ENTMOOT.
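That inheritance variant could look roughly like this (a hypothetical sketch; the class names, hyperparameters, and _fit return values are placeholders, not real bofire or ENTMOOT calls):

```python
# Hypothetical sketch: a mean-model-only LGBMSurrogate, and a subclass
# that adds the uncertainty parameters and overrides _fit to use ENTMOOT.
from dataclasses import dataclass

@dataclass
class LGBMSurrogate:
    # mean-model hyperparameter (illustrative)
    num_boost_round: int = 100

    def _fit(self, X, y):
        # would train a plain LightGBM mean model here
        return "lgbm-mean-model"

@dataclass
class LGBMEntingSurrogate(LGBMSurrogate):
    # uncertainty parameter added by the subclass (illustrative)
    beta: float = 1.96

    def _fit(self, X, y):
        # would instead delegate to ENTMOOT's Enting model, which wraps
        # both mean and uncertainty estimation, hence the separate _fit
        return "enting-mean-plus-uncertainty-model"
```

The duplicated _fit is the cost of ENTMOOT owning the whole training loop rather than composing with an externally trained mean model.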

Taking this further, one could even build it up from LGBMSurrogate and XGBSurrogate classes that are ENTMOOT-agnostic and handle only the mean model, plus an EntingUncertaintySurrogate that adds the uncertainty part. LGBMEntingSurrogate could then be built by inheriting from both LGBMSurrogate and EntingUncertaintySurrogate. That would be the fully object-oriented solution, but since these kinds of structures are currently not supported by ENTMOOT, I think it is ok to just go with the flat LGBMEntingSurrogate.
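The mixin-style composition described here can be sketched with dataclass multiple inheritance (again hypothetical names and parameters, not the bofire API):

```python
# Hypothetical sketch: backend classes carry only mean-model parameters,
# a separate mixin carries the ENTMOOT uncertainty parameters, and the
# concrete surrogate is composed from both via multiple inheritance.
from dataclasses import dataclass

@dataclass
class LGBMSurrogate:
    # ENTMOOT-agnostic mean-model hyperparameter (illustrative)
    num_boost_round: int = 100

@dataclass
class EntingUncertaintySurrogate:
    # uncertainty hyperparameter (illustrative)
    beta: float = 1.96

@dataclass
class LGBMEntingSurrogate(LGBMSurrogate, EntingUncertaintySurrogate):
    # inherits fields from both parents; dataclasses merge them
    # following the MRO, so no fields need to be restated here
    pass
```

An XGBEntingSurrogate would then be one more three-line class reusing the same uncertainty mixin, which is what makes this the "super object oriented" version.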

What do you think?

2 participants