[Custom data] validation loss is less than train loss, and opposite for the accuracy... #146
-
Did I do something wrong if the validation loss is less than the training loss for several models?
Replies: 2 comments 1 reply
-
As long as the final `val_acc` meets expectations, I think this phenomenon is not an issue. For those basic ImageNet training curves, the main reason is the heavy regularization and data augmentation. Like if setting
Ya, it's common depending on the loss function and data augmentation. As you can see from some results in imagenet#training using the default `BinaryCrossEntropyTimm` loss, all sharing `val loss < train loss, val accuracy > train accuracy`. You may also try models like `EfficientNetV1B0` as a baseline, which should be more stable.
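To illustrate one mechanism behind this, here is a minimal NumPy sketch (not from the thread) showing how label smoothing, a regularizer typically enabled only during training, makes the very same predictions score a higher loss at train time than at evaluation time. The `bce` helper below is a hypothetical stand-in for a train-time-smoothed binary cross-entropy, not the actual `BinaryCrossEntropyTimm` implementation.

```python
import numpy as np

def bce(y_true, y_pred, label_smoothing=0.0, eps=1e-7):
    """Binary cross-entropy with optional label smoothing.

    Smoothing pulls hard 0/1 targets toward 0.5, which penalizes even
    perfectly confident predictions -- raising the measured loss.
    """
    y_true = y_true * (1.0 - label_smoothing) + 0.5 * label_smoothing
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return float(np.mean(-(y_true * np.log(y_pred)
                           + (1.0 - y_true) * np.log(1.0 - y_pred))))

# A confident, mostly correct set of predictions.
y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.1, 0.8, 0.2])

# Same predictions, but smoothing is applied only at "train" time.
train_loss = bce(y_true, y_pred, label_smoothing=0.1)
val_loss = bce(y_true, y_pred, label_smoothing=0.0)
print(train_loss > val_loss)  # True: train loss reads higher by construction
```

The same effect holds for dropout and heavy data augmentation: the training batches are deliberately harder or noisier than the clean validation batches, so `train loss > val loss` can be the expected, healthy outcome rather than a bug.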