Releases: jinlow/forust
v0.2.10
v0.2.9
v0.2.8
v0.2.7
This release introduces the ability to record metrics on data while fitting, by passing data to the evaluation_data parameter. Additionally, early_stopping_rounds is now supported, so training will be cut short if no improvement in performance is seen for a specified number of iterations.
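A minimal sketch of how these options might be used. Only the evaluation_data and early_stopping_rounds names come from this release; the remaining constructor and fit arguments (iterations, the shape of the evaluation tuples) are assumptions:

```python
import numpy as np
from forust import GradientBooster

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(float)

# Hold out some rows to monitor performance while fitting.
X_train, y_train = X[:800], y[:800]
X_valid, y_valid = X[800:], y[800:]

model = GradientBooster(
    iterations=500,              # assumed constructor argument
    early_stopping_rounds=20,    # stop if no improvement for 20 iterations
)
model.fit(
    X_train,
    y_train,
    # Metrics are recorded on this data at each iteration.
    evaluation_data=[(X_valid, y_valid)],
)
```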
v0.2.6
v0.2.5
This release adds the method parameter to the predict_contributions method. This allows either "average" (the method xgboost uses to calculate the contribution matrix when approx_contribs is True) or "weight" to be specified. The "average" method averages the leaf weights across all nodes, and then uses this for calculating the contributions. The "weight" method instead uses the internal leaf weights to determine how a feature impacts the final score. Both methods result in a contribution matrix that, when summed over each row, is equivalent to the output of the predict method.
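A minimal sketch comparing the two methods. The "average" and "weight" values come from this release; the rest of the call shapes, including the extra bias column in the contribution matrix, are assumptions:

```python
import numpy as np
from forust import GradientBooster

X = np.random.default_rng(1).normal(size=(200, 4))
y = (X[:, 0] > 0).astype(float)

model = GradientBooster()
model.fit(X, y)

contribs_avg = model.predict_contributions(X, method="average")
contribs_wgt = model.predict_contributions(X, method="weight")

# Per the release notes, each row of the contribution matrix sums
# to the corresponding output of the predict method.
preds = model.predict(X)
assert np.allclose(contribs_avg.sum(axis=1), preds)
assert np.allclose(contribs_wgt.sum(axis=1), preds)
```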
v0.2.4
This release adds an experimental parameter, create_missing_branch, that, if True, will create a separate branch for missing values, producing a ternary tree; the missing node is given the same weight value as its parent node. If this parameter is False, missing values are sent down either the left or right branch, producing a binary tree. Defaults to False.
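A minimal sketch, assuming create_missing_branch is a keyword argument on the GradientBooster constructor:

```python
import numpy as np
from forust import GradientBooster

X = np.random.default_rng(2).normal(size=(500, 3))
X[::10, 0] = np.nan  # inject some missing values
y = (np.nan_to_num(X[:, 0]) > 0).astype(float)

# Ternary trees: missing values get their own branch, with the missing
# node's weight defaulting to the parent's weight.
ternary = GradientBooster(create_missing_branch=True)
ternary.fit(X, y)

# Default behavior: binary trees, missing routed left or right.
binary = GradientBooster(create_missing_branch=False)
binary.fit(X, y)
```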
v0.2.3
- Fixed a bug where features with sparsely populated levels were incorrectly reported as having zero variance.
- Added subsample parameter to sample records prior to training each tree.
- Added missing parameter, so the missing value can now be assigned to any float, rather than only being np.nan.
- Added a metadata store to the booster object, so info can be saved on the booster and loaded again later after saving (see the sketch after this list).
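A minimal sketch of the v0.2.3 additions. The subsample and missing parameters come from these notes, while the save/load and metadata method names (save_booster, load_booster, insert_metadata, get_metadata) are illustrative assumptions:

```python
import numpy as np
from forust import GradientBooster

X = np.random.default_rng(3).normal(size=(300, 3))
X[::7, 1] = -999.0  # use a sentinel value for missing instead of np.nan
y = (X[:, 0] > 0).astype(float)

model = GradientBooster(
    subsample=0.8,   # sample 80% of records before training each tree
    missing=-999.0,  # treat -999.0 as the missing value
)
model.fit(X, y)

# Attach arbitrary info to the booster so it survives a save/load
# round trip; method names here are assumed for illustration.
model.insert_metadata("trained_by", "example-script")
model.save_booster("model.json")

loaded = GradientBooster.load_booster("model.json")
print(loaded.get_metadata("trained_by"))
```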