In the neural networks section we tell people to do a 4-way cross-validation and, once those runs are complete, to retrain on the entire dataset. Is this really the right way to do things? In other neural net frameworks (e.g. Keras) you can do multiple iterations of training, so that final retraining isn't needed, since the training at each split will continue from where the previous one left off. Can scikit-learn behave like this?
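For context, the workflow the docs describe can be sketched like this (the dataset and hyperparameters below are illustrative, not from the docs):

```python
# Sketch of the described workflow: score with 4-fold CV, then retrain
# one final model on the full dataset. Each CV fold is an independent fit.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)

# 4-way cross-validation, used only for evaluation.
scores = cross_val_score(clf, X, y, cv=4)
print(scores.mean())

# Final retraining on the entire dataset; this fit starts from scratch.
clf.fit(X, y)
```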
> the training at each split will continue from where the previous one left off
I'm not sure I understand you correctly, but I believe that in CV each iteration is supposed to be independent of the rest. There might be some trick Keras uses to minimize computation — could you give the name of the specific function in Keras or another framework?

I think the example is fine as long as the metrics on the CV results are good.
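On the "continue from where the previous one left off" point: scikit-learn's MLP estimators do support resuming training via `warm_start=True` (and also `partial_fit`), which is the closest analogue to continued training in Keras. A minimal sketch (toy data, illustrative settings), with the caveat that this is for incremental training, not for sharing weights across CV folds, which should stay independent:

```python
# Continuing training in scikit-learn with warm_start: a second call to
# fit() resumes from the previously learned weights instead of
# reinitializing them. Dataset and hyperparameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=50,
                    warm_start=True, random_state=0)
clf.fit(X, y)   # first round of training
clf.fit(X, y)   # continues from the existing weights (warm_start=True)
```

But note that reusing weights between folds would leak information across splits, which is exactly what CV tries to avoid — so the "retrain on everything at the end" step in the docs is still the right pattern.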