[Custom data] validation loss is less than train loss, and opposite for the accuracy... #146

Answered by leondgarse
mewcat2011 asked this question in Q&A
Discussion options

Yes, it's common, depending on the loss function and data augmentation. As you can see from the results in imagenet#training using the default BinaryCrossEntropyTimm loss, they all share val loss < train loss and val accuracy > train accuracy. You may also try a model like EfficientNetV1B0 as a baseline, which should be more stable.
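One way this happens: if the training loss is computed with label smoothing (as losses like BinaryCrossEntropyTimm can be configured to do) while validation uses plain binary cross-entropy, the training loss is floored above zero even for perfect predictions, so val loss < train loss is expected. A minimal NumPy sketch, assuming a smoothing factor of 0.1 and made-up prediction values:

```python
import numpy as np

def bce(y_true, y_pred, smoothing=0.0, eps=1e-7):
    # Binary cross-entropy; label smoothing pulls targets toward 0.5
    y_true = y_true * (1.0 - smoothing) + 0.5 * smoothing
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(np.mean(-(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))))

# Hypothetical, near-perfect predictions on the same batch
y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.95, 0.05, 0.90, 0.97])

train_loss = bce(y_true, y_pred, smoothing=0.1)  # smoothing active during training
val_loss = bce(y_true, y_pred, smoothing=0.0)    # validation uses hard labels
print(train_loss > val_loss)  # smoothed loss is strictly larger here
```

Train-only data augmentation has a similar effect: the model is scored on harder (augmented) samples during training and on clean samples during validation, so the gap is not by itself a sign of a bug.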

Answer selected by mewcat2011