Pass the loss from the compile call to the target_encoder instantiation #277
errata: I don't know why it was not failing before, but now I get this exception when setting categorical_crossentropy:

Which makes sense. Still, my proposal holds. My solution was to subclass KerasClassifier and add a custom target_encoder that always "uses" categorical_crossentropy.
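For reference, a minimal sketch of what such a subclass might look like, assuming SciKeras's ClassifierLabelEncoder keys its encoding off its loss argument; the class name is made up:

```python
from scikeras.wrappers import KerasClassifier
from scikeras.utils.transformers import ClassifierLabelEncoder


class OneHotKerasClassifier(KerasClassifier):  # hypothetical name
    @property
    def target_encoder(self):
        # Always encode targets as if the loss were categorical_crossentropy
        # (i.e. one-hot), regardless of what the model compiles with.
        return ClassifierLabelEncoder(loss="categorical_crossentropy")
```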
Thank you for the detailed issue report. Currently the transformers are initialized and fit before the model is created, so there's no introspection possible:

Lines 828 to 835 in d50e75a
If we switched the order, the model-building function wouldn't have access to certain metadata which is quite useful for dynamically creating models:

scikeras/scikeras/utils/transformers.py
Lines 306 to 311 in d50e75a
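To illustrate why that metadata matters, here is a sketch of a model-building function using the `meta` argument SciKeras passes in (the layer sizes are just examples):

```python
from tensorflow import keras


def get_model(meta):
    # meta is filled in by the fitted encoders (e.g. n_features_in_, n_classes_),
    # which is why the encoders currently must be fit before the model is built.
    model = keras.Sequential([
        keras.layers.Dense(64, activation="relu",
                           input_shape=(meta["n_features_in_"],)),
        keras.layers.Dense(meta["n_classes_"], activation="softmax"),
    ])
    model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    return model
```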
So your goal is to have SciKeras automatically choose a loss based on the input data, right? Currently it works the other way around: you can hardcode the loss to categorical_crossentropy on the wrapper, and the target_encoder is set up accordingly.
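In other words, a usage sketch (`get_model` is a placeholder):

```python
from scikeras.wrappers import KerasClassifier

# The loss passed here is what the target_encoder is built from,
# independently of any loss compiled inside the model function.
clf = KerasClassifier(model=get_model, loss="categorical_crossentropy")
```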
Based on the NOTE: don't feel that I'm imposing this, I'm just raising something that caught my attention. Perhaps there is another solution than automatically setting the loss.
Yup, sorry, bad wording on my part; I'm referring to the loss. Is there a problem with the loss always being categorical_crossentropy?
Even for binary classification? Would that affect how the target_encoder is initialized for binary classification?
I think it should still work for binary classification, yes. But I'm looking at the MLPClassifier notebook/guide again: it is already dynamically setting the loss function.
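A sketch of that pattern (assumed, not the guide's verbatim code; the loss choice keys off the `meta` dict SciKeras passes to the model function):

```python
from tensorflow import keras


def get_clf_model(meta):
    # Pick the loss from metadata about y, so the same model function
    # handles binary and multi-class targets.
    n_classes = meta["n_classes_"]
    loss = "binary_crossentropy" if n_classes == 2 else "sparse_categorical_crossentropy"
    out_units = 1 if n_classes == 2 else n_classes
    out_act = "sigmoid" if n_classes == 2 else "softmax"
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu",
                           input_shape=(meta["n_features_in_"],)),
        keras.layers.Dense(out_units, activation=out_act),
    ])
    model.compile(loss=loss, optimizer="adam")
    return model
```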
I will test soon and give you feedback. Thanks for your suggestions.
SciKeras version: 0.8.0

(I feel this is) related to #206
I was following the MLPClassifier tutorial on the wiki page. It was great that the model function could handle binary and multi-class classification. However, I encountered this issue while executing the test:
My y has 6 classes. I'm using KerasClassifier directly, i.e. no sub-classing. This is how I was creating the classifier:

Initially, I was passing the loss as a KerasClassifier parameter, and it was training fine. But since I wanted to make my model as plug-and-play as possible, I moved the loss setting inside the model function. This is where the exception started to show up. I took a look at how scikeras initializes the target encoder:

scikeras/scikeras/wrappers.py
Lines 1395 to 1415 in d50e75a

scikeras/scikeras/utils/transformers.py
Lines 154 to 175 in d50e75a
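Roughly, the logic in those referenced lines boils down to the following dispatch (a simplified sketch, not the actual SciKeras code; the helper name is made up):

```python
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder


def pick_target_encoder(loss):  # hypothetical helper for illustration
    # One-hot encode targets only when the wrapper's loss is
    # categorical_crossentropy; otherwise integer-encode them.
    if loss == "categorical_crossentropy":
        return OneHotEncoder(sparse=False)
    return OrdinalEncoder()
```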
Before, it was using one-hot encoding because I was passing loss='categorical_crossentropy' to KerasClassifier.

What ended up working for me was to still use loss='categorical_crossentropy'. It looks like it doesn't affect the scores when using sklearn's cross_validate (correct me if I'm wrong), nor does it affect that the target_encoder would use ordinal encoding. The drawback of this solution is that it doesn't look suitable and may confuse newcomers.

Other solutions that I thought of to solve my particular problem were:
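For concreteness, the workaround amounts to something like this (a sketch; `get_model` is assumed to compile its own loss):

```python
from scikeras.wrappers import KerasClassifier

clf = KerasClassifier(
    model=get_model,                  # compiles its loss inside the model function
    loss="categorical_crossentropy",  # kept only so the target_encoder one-hot encodes y
)
```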
To finally solve this issue, I propose to extract the loss (and perhaps the optimizer?) from the model, I suppose around these lines (I don't have any experience with this repository):

scikeras/scikeras/wrappers.py
Lines 897 to 901 in d50e75a
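A hypothetical sketch of what that extraction could look like (the helper is made up; Keras does expose the compiled loss as `model.loss`):

```python
from tensorflow import keras


def loss_for_target_encoder(model: keras.Model, wrapper_loss=None):
    """Prefer the loss compiled into the model; fall back to the wrapper's loss.

    Hypothetical helper illustrating the proposal: after the model-building
    function runs, the wrapper could read the compiled loss back off the model
    and hand it to the target_encoder, instead of requiring it as a
    constructor argument.
    """
    return getattr(model, "loss", None) or wrapper_loss
```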