Hyperparameter optimization #264
lars-reimann added a commit that referenced this issue on May 26, 2023.
lars-reimann pushed a commit that referenced this issue on Jun 1, 2023.
## 0.13.0 (v0.12.0...v0.13.0) (2023-06-01)

### Features

* add `Choice` class for possible values of hyperparameter (#325) (d511c3e), closes #264
* Add `RangeScaler` transformer (#310) (f687840), closes #141
* Add methods that tell which columns would be affected by a transformer (#304) (3933b45), closes #190
* Getters for hyperparameters of Regression and Classification models (#306) (5c7a662), closes #260
* improve error handling of table (#308) (ef87cc4), closes #147
* Remove warnings thrown in new `Transformer` methods (#324) (ca046c4), closes #323
🎉 This issue has been resolved in version 0.28.0 🎉 The release is available on:
Your semantic-release bot 📦🚀
Is your feature request related to a problem?
Finding appropriate values for hyperparameters by hand is tedious. There should be automation to try different combinations of values.
Desired solution
1. Wherever a hyperparameter currently accepts a value of some type `T`, it should also be possible to pass a `Choice[T]` (see feat: add `Choice` class for possible values of hyperparameter, #325). Example: see the first sketch after this list.
2. When calling `fit` on a model that contains a `Choice` at any level (can be nested), raise an exception. Also point to the correct method (see 4.).
3. Add a method `fit_by_exhaustive_search` to `Classifier` and subclasses with parameter `optimization_metric`: the metric to use to find the best model. It should have type `ClassifierMetric`, which is an enum with one value for each classifier metric we have available.
4. Add a method `fit_by_exhaustive_search` to `Regressor` and subclasses with parameter `optimization_metric`: the metric to use to find the best model. It should have type `RegressorMetric`, which is an enum with one value for each regressor metric we have available.
5. `fit_by_exhaustive_search` should enumerate all `Choice`s inside of the model and its children, and for each possible setting create a model without choices, fit it, and compute the listed metric on it. It should then keep track of the best (fitted) model according to the metric and return it at the end (a rough sketch of this loop follows the list). `GridSearchCV` of `scikit-learn` can be useful for this.
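
A rough, non-authoritative sketch of items 1 and 2, for illustration only: the `Choice` class, the helper functions, and the parameter names `number_of_trees` and `maximum_depth` below are stand-ins invented for this sketch, not the actual Safe-DS API, and only flat (non-nested) choices are handled.

```python
from __future__ import annotations

from itertools import product
from typing import Generic, TypeVar

T = TypeVar("T")


class Choice(Generic[T]):
    """A set of possible values for a single hyperparameter."""

    def __init__(self, *values: T) -> None:
        self.values = list(values)


def assert_no_choices(hyperparameters: dict[str, object]) -> None:
    """What a plain `fit` could do (item 2): refuse to run while any value is still a Choice."""
    for name, value in hyperparameters.items():
        if isinstance(value, Choice):
            raise ValueError(
                f"Hyperparameter '{name}' is a Choice; call fit_by_exhaustive_search instead."
            )


def expand_choices(hyperparameters: dict[str, object]) -> list[dict[str, object]]:
    """Turn a dict that may contain Choices into every concrete combination."""
    names = list(hyperparameters)
    options = [
        value.values if isinstance(value, Choice) else [value]
        for value in hyperparameters.values()
    ]
    return [dict(zip(names, combination)) for combination in product(*options)]


# Example (item 1): one hyperparameter gets a Choice, the other a fixed value.
combinations = expand_choices({"number_of_trees": Choice(10, 50), "maximum_depth": 4})
print(combinations)
# [{'number_of_trees': 10, 'maximum_depth': 4}, {'number_of_trees': 50, 'maximum_depth': 4}]
```

A real implementation would also have to recurse into nested models and their children, as items 2 and 5 describe.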
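
A similarly hedged sketch of the search loop from item 5, assuming scikit-learn is available: the dataset, the `DecisionTreeClassifier`, the `max_depth` grid, and accuracy as the score are placeholders rather than the proposed `ClassifierMetric` enum or the Safe-DS model classes.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# These combinations would come from expanding the model's Choices (see the previous sketch).
combinations = [{"max_depth": 2}, {"max_depth": 4}, {"max_depth": 8}]

# Fit one model per combination, score it, and keep the best fitted model.
best_model, best_score = None, float("-inf")
for params in combinations:
    model = DecisionTreeClassifier(random_state=0, **params).fit(X_train, y_train)
    score = accuracy_score(y_test, model.predict(X_test))  # stand-in for optimization_metric
    if score > best_score:
        best_model, best_score = model, score

print(best_model.get_params()["max_depth"], round(best_score, 3))
```

`GridSearchCV` from scikit-learn implements essentially this enumerate-fit-score-keep-best loop (with cross-validation instead of a single holdout split), so an implementation could delegate to it.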