Description
Created from #4332 (comment).
During distributed training, when a user-provided `init_score` is not given (the default), each worker determines an initial score to boost from based on its local data. These initial scores are then synced across all workers by taking their mean.

In situations where the distribution of the target is very different across workers, this can lead to a lower-quality initial score, which might increase the number of boosting rounds it takes to fit the training data.
#4332 addressed this for binary classification, but this problem is still present in other objectives. See #4332 (review).
Reproducible example
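A minimal sketch of the effect, not using LightGBM itself. It assumes the regression (L2) objective, where the optimal initial score is the mean of the labels, and two hypothetical workers whose local init scores are synced by an unweighted mean:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two workers with very different target distributions and sizes.
worker_a = rng.normal(loc=0.0, scale=1.0, size=1000)
worker_b = rng.normal(loc=10.0, scale=1.0, size=10)

# Syncing the local init scores by an unweighted mean across workers:
synced_init = np.mean([worker_a.mean(), worker_b.mean()])

# The init score that would be computed from the pooled training data:
global_init = np.concatenate([worker_a, worker_b]).mean()

print(synced_init)  # close to 5.0
print(global_init)  # close to 0.1
```

The synced score lands near 5.0 while the score computed from the pooled data is near 0.1, so boosting starts much farther from the data.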
Environment info
LightGBM version or commit hash: 0701a32
Command(s) you used to install LightGBM: this affects any installation of LightGBM used for distributed training.
Additional Comments
I've given this the label "dask" because it will affect training with `lightgbm.dask`, but it is not specific to this project's Dask interface. Any other distributed training option (https://lightgbm.readthedocs.io/en/latest/Parallel-Learning-Guide.html#integrations) will be affected as well.