
How to do hyperparameter tuning with DeepCTR? #274

Open
CFZhai opened this issue Feb 18, 2023 · 4 comments
Labels
question Further information is requested

Comments

@CFZhai

CFZhai commented Feb 18, 2023

E.g., could you show an example of how to do hyperparameter tuning with DeepFM?
Thank you!

@alibugra

I suggest you use Hyperopt: http://hyperopt.github.io/hyperopt/

@CFZhai
Author

CFZhai commented Feb 24, 2023

Do you have an example of using Hyperopt and DeepCTR together? Thank you, Alibugra!

@alibugra
Copy link

alibugra commented Mar 5, 2023

I prepared an example based on the file "examples/run_classification_criteo.py". In that file, delete the code that defines the model (Lines 54 to 66) and add the following code instead.

    def objective_function(param_space):
        # Unpack the hyperparameters sampled by Hyperopt for this trial
        dnn_hidden_units = param_space["dnn_hidden_units"]
        dnn_dropout = param_space["dnn_dropout"]

        # Build a DeepFM model with the sampled hyperparameters
        model = DeepFM(linear_feature_columns=linear_feature_columns, dnn_feature_columns=dnn_feature_columns,
                       task='binary',
                       dnn_hidden_units=dnn_hidden_units, dnn_dropout=dnn_dropout,
                       l2_reg_embedding=1e-5, device=device)

        model.compile("adagrad", "binary_crossentropy",
                      metrics=["binary_crossentropy", "auc"])

        history = model.fit(train_model_input, train[target].values, batch_size=32, epochs=10, verbose=2,
                            validation_split=0.2)
        pred_ans = model.predict(test_model_input, 256)
        print("")
        print("test LogLoss", round(log_loss(test[target].values, pred_ans), 4))
        auc = round(roc_auc_score(test[target].values, pred_ans), 4)
        print("test AUC", auc)

        # Hyperopt minimizes "loss", so return the negative AUC to maximize AUC
        return {
            "loss": -auc,
            "status": STATUS_OK,
            "dnn_hidden_units": dnn_hidden_units,
            "dnn_dropout": dnn_dropout
        }

    # Run a TPE search over the space for 20 evaluations
    trials = Trials()
    param_space = {
        "dnn_hidden_units": hp.choice("dnn_hidden_units",
                                      [(128, 128), (256, 256)]),
        "dnn_dropout": hp.choice("dnn_dropout", [0, 0.1])
    }
    best = fmin(fn=objective_function, space=param_space,
                algo=tpe.suggest, max_evals=20, trials=trials)
    print("best parameter is:", str(best))

Do not forget to install Hyperopt and its dependencies, and add the following imports to the code.

import hyperopt
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials

@CFZhai
Author

CFZhai commented Mar 5, 2023

That is great! Thank you so much!
