Sorry, I didn't realise I was on a pre-release version (2.0.6). After updating to 2.0.7 the issue is resolved. I leave the original text below anyway.
Usage questions:
What does "Finished loading 120 models" mean? I get this as an output after incremental training.
Are these the single trees (= models) in the boosting ensemble? The number increases every time I train incrementally, rather than staying the same or decreasing (see the sketch below these questions).
When training incrementally I get many warnings:
[WARNING] No further splits with positive gain, best gain: -inf
Is there a best practice for tackling this, so that these warnings are avoided?
Thanks a lot. I will leave the issue open for the usage questions, but will close it afterwards.
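For what it's worth, my working assumption is that each "model" is one tree of the boosting ensemble. Here is a minimal sketch to check that (made-up data, not my actual pipeline; that num_trees() counts the total trees in the ensemble is my assumption here):

```python
import numpy as np
import lightgbm as lgb

# toy data, only for illustration
rng = np.random.RandomState(0)
X, y = rng.rand(200, 5), rng.rand(200)

params = {"objective": "regression", "num_leaves": 10}

booster = None  # no model exists before the first call
for step in range(3):
    data = lgb.Dataset(X, y)
    # continue from the previous booster; init_model=None starts fresh
    booster = lgb.train(params, data, num_boost_round=40, init_model=booster)
    # if each loaded "model" is one tree, this should print 40, 80, 120,
    # matching the growing "Finished loading ... models" count
    print(step + 1, booster.num_trees())
```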
Obsolete issue:
Hi,
my target variable is a time series, and I split my dataset into multiple training/testing phases.
At each training phase I want to continue training from the previous model (incremental learning).
Within each training phase I do cross-validation with TimeSeriesSplit, testing multiple parameter settings.
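In outline, the setup looks like the following sketch (random stand-in data and arbitrary parameter values, not my exact code; TimeSeriesSplit is from scikit-learn):

```python
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import TimeSeriesSplit

# stand-in time-series data, only to illustrate the structure
rng = np.random.RandomState(0)
X, y = rng.rand(600, 5), rng.rand(600)

params = {"objective": "regression", "learning_rate": 0.01,
          "num_leaves": 10, "max_bin": 255, "verbose": 0}

booster = None  # phase 1 starts without an existing model
for phase, idx in enumerate(np.array_split(np.arange(len(X)), 3), start=1):
    print(f"INFO: fitting LightGBM in phase {phase}")
    X_ph, y_ph = X[idx], y[idx]

    # parameter tuning inside the phase via time-series cross-validation
    for tr, te in TimeSeriesSplit(n_splits=3).split(X_ph):
        cv_model = lgb.train(params, lgb.Dataset(X_ph[tr], y_ph[tr]),
                             num_boost_round=20, init_model=booster)
        # ... score cv_model on X_ph[te] and keep the best parameters ...

    # final fit for this phase, continuing from the previous phase's model
    booster = lgb.train(params, lgb.Dataset(X_ph, y_ph),
                        num_boost_round=20, init_model=booster)
```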
At the moment I am having trouble with the console output. Here is a sample:
Phase 1 refers to the first training phase (there is no existing model yet).
******************************************************
INFO: fitting LightGBM in phase 1
--------------------------------------------
INFO: Start parameter tuning
INFO: Handling level0
INFO: **improved cv_error** with params={'verbose': 0, 'learning_rate': 0.01, 'num_leaves': 10, 'max_bin': 255} || current best result: 8207.489200311997
INFO: **improved cv_error** with params={'verbose': 0, 'learning_rate': 0.01, 'num_leaves': 20, 'max_bin': 255} || current best result: 8090.180625524847
INFO: tested for params={'verbose': 0, 'learning_rate': 0.01, 'num_leaves': 30, 'max_bin': 255} || result: 8252.27902379209
INFO: tested for params={'verbose': 0, 'learning_rate': 0.02, 'num_leaves': 10, 'max_bin': 255} || result: 8195.467582308242
INFO: tested for params={'verbose': 0, 'learning_rate': 0.02, 'num_leaves': 20, 'max_bin': 255} || result: 8136.15878262733
[...]
But when I enter phase 2, I get the following:
******************************************************
INFO: fitting LightGBM in phase 2
--------------------------------------------
INFO: Start parameter tuning
INFO: Handling level0
[LightGBM] [Info] Finished loading 120 models
[LightGBM] [Info] Trained a tree with leaves=10 and max_depth=6
[LightGBM] [Info] Trained a tree with leaves=10 and max_depth=6
[LightGBM] [Info] Trained a tree with leaves=10 and max_depth=6
[LightGBM] [Info] Trained a tree with leaves=10 and max_depth=6
[... Many more of these outputs...]
[LightGBM] [Info] Trained a tree with leaves=20 and max_depth=9
[LightGBM] [Info] Trained a tree with leaves=20 and max_depth=7
[LightGBM] [Info] Trained a tree with leaves=20 and max_depth=7
[LightGBM] [Info] Trained a tree with leaves=20 and max_depth=11
[LightGBM] [Info] Trained a tree with leaves=20 and max_depth=8
INFO: tested for params={'verbose': 0, 'learning_rate': 0.01, 'num_leaves': 20, 'max_bin': 255} || result: 5529.753655247278
When I set init_model=None, the "Trained a tree with leaves=XX and max_depth=XX" logs disappear in round 2. So I think this is a bug.
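To illustrate, the two calls below differ only in init_model (names as in the sketch above); only the first one produces the per-tree log lines on my Windows machine:

```python
# continued training: prints "Trained a tree ..." for every iteration
m1 = lgb.train(params, lgb.Dataset(X, y), num_boost_round=20,
               init_model=booster)

# fresh training: the per-tree log lines do not appear
m2 = lgb.train(params, lgb.Dataset(X, y), num_boost_round=20,
               init_model=None)
```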
Environment info
Operating System: Windows 7 Professional Service Pack 1
CPU: Intel(R) Core(TM) i7-2640M CPU @ 2.8 GHz
Python version: 3.6.1, Anaconda 4.4.0, 64-bit
LightGBM version: 2.0.6
Further notice: I just tried the code below on Linux Mint as well, and I do not see this std-output there, so it may be Windows-specific.
Reproducible code:
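(A stand-in with made-up data, following the same pattern as my script: two consecutive lgb.train calls, the second continuing from the first. The output shown after it is from my original run, so the exact numbers will differ.)

```python
import numpy as np
import lightgbm as lgb

# made-up data with the shape from the log: 100 rows, 5 features
rng = np.random.RandomState(42)
X, y = rng.rand(100, 5), rng.rand(100)

params = {"objective": "regression", "num_leaves": 4}

gbm1 = lgb.train(params, lgb.Dataset(X, y), num_boost_round=10)
print("FINISHED training model 1")

# the second call continues from the first booster
gbm2 = lgb.train(params, lgb.Dataset(X, y), num_boost_round=10,
                 init_model=gbm1)
print("FINISHED training model 2")
```
Output: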
[LightGBM] [Info] Total Bins 105
[LightGBM] [Info] Number of data: 100, number of used features: 5
FINISHED training model 1
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Trained a tree with leaves=4 and max_depth=3
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Trained a tree with leaves=3 and max_depth=2
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Trained a tree with leaves=3 and max_depth=2
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[...]
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Trained a tree with leaves=4 and max_depth=2
FINISHED training model 2
Yes, the output was reduced after updating (sorry, I didn't realise I was on an old version).
But I still get the "Finished loading X models" message. Can you give me a hint as to what this message means? I get it as output after each incremental training, and the number seems to be always increasing.