
Releases: jinlow/forust

v0.2.10

19 May 23:18
d7b30fa
  • Added the branch difference method for calculating contributions.
  • Normalized the logloss metric by dividing it by the sum of the sample weights.
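The normalization in the second bullet can be sketched in plain Python. This is an illustrative implementation of a weighted binary logloss divided by the sum of the sample weights, not forust's internal code; the function name is hypothetical.

```python
import math

def normalized_logloss(y_true, y_pred, sample_weight):
    """Weighted binary logloss, normalized by the sum of the sample
    weights rather than the record count (a sketch, not forust's code)."""
    total = 0.0
    for y, p, w in zip(y_true, y_pred, sample_weight):
        # Weighted negative log-likelihood of each record.
        total += -w * (y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / sum(sample_weight)
```

With uniform weights this reduces to the ordinary mean logloss, and rescaling all weights by a constant leaves the metric unchanged, which is the point of the normalization.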

v0.2.9

18 May 19:36
b4f0d5a

Hot fix to add default values for GOSS parameters to booster.

v0.2.8

18 May 18:46
5f4460c

This release adds the GOSS (Gradient-based One-Side Sampling) method to the package. It also includes small performance improvements, and the learning_rate is now applied after the monotonicity bounds are generated.
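The core GOSS idea can be sketched in a few lines. This follows the published algorithm (keep the top fraction of records by absolute gradient, randomly sample the rest, and up-weight the sampled records to keep the gradient sum unbiased); it is a sketch under those assumptions, not forust's implementation, and the function name is hypothetical.

```python
import random

def goss_sample(gradients, top_rate=0.1, other_rate=0.2, seed=0):
    """Gradient-based One-Side Sampling sketch.

    Keeps the top_rate fraction of records with the largest absolute
    gradients, randomly samples other_rate of the full data from the
    remainder, and up-weights those sampled records by
    (1 - top_rate) / other_rate so the gradient sum stays unbiased.
    Returns {record index: sample weight}.
    """
    rng = random.Random(seed)
    n = len(gradients)
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    n_top = int(n * top_rate)
    n_other = int(n * other_rate)
    top_idx = order[:n_top]
    sampled = rng.sample(order[n_top:], n_other)
    amplify = (1.0 - top_rate) / other_rate
    weights = {i: 1.0 for i in top_idx}
    weights.update({i: amplify for i in sampled})
    return weights
```

Records with large gradients are under-trained, so they are always kept; the amplification factor compensates for the small-gradient records that were dropped.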

v0.2.7

15 May 13:45
234e112

This release introduces the ability to record metrics on data while fitting, by passing data to the evaluation_data parameter. Additionally, early_stopping_rounds is now supported, so training will be cut short if no improvement in performance is seen for a specified number of iterations.
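The early-stopping logic described above amounts to tracking the best evaluation score and breaking out of the boosting loop once it stops improving. The sketch below shows that control flow in isolation; the function name and callback signatures are illustrative, not forust's API.

```python
def fit_with_early_stopping(train_one_round, eval_metric, n_rounds,
                            early_stopping_rounds):
    """Stop boosting when the evaluation metric (lower is better) has
    not improved for early_stopping_rounds consecutive iterations."""
    best = float("inf")
    best_round = 0
    history = []
    for i in range(n_rounds):
        train_one_round(i)          # fit one more tree
        score = eval_metric(i)      # score on the evaluation data
        history.append(score)
        if score < best:
            best, best_round = score, i
        elif i - best_round >= early_stopping_rounds:
            break                   # no improvement for too long
    return best_round, history
```

The recorded history is what "record metrics on data while fitting" refers to; the best round is what you would use for prediction after training is cut short.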

v0.2.6

09 May 02:00
cc33c5f

Made the partial dependence predictions faster, and made improvements to the Python API.

v0.2.5

08 May 01:09
9fc288f

This release adds the method parameter to the predict_contributions method. This allows either "average" (the method XGBoost uses to calculate the contribution matrix when approx_contribs is True) or "weight" to be specified. The "average" method averages the leaf weights across all nodes, and then uses these averages to calculate the contributions. The "weight" method instead uses the internal leaf weights to determine how a feature impacts the final score. Both methods result in a contribution matrix that, when summed across each row, is equivalent to the output of the predict method.
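The row-sum property can be seen in a toy weight-style attribution: as a record traverses a tree, each split's feature is credited with the change in node weight between the parent and the child taken, so the root weight plus all credits telescopes to the leaf weight. The Node class and function below are illustrative, not forust's internals.

```python
class Node:
    def __init__(self, weight, feature=None, threshold=None, left=None, right=None):
        self.weight = weight        # internal weight of this node
        self.feature = feature      # split feature index (None for leaves)
        self.threshold = threshold
        self.left = left
        self.right = right

def contributions_weight(node, x, n_features):
    """Credit each split's feature with the change in node weight
    between the parent and the child the record falls into."""
    contrib = [0.0] * n_features
    while node.feature is not None:
        child = node.left if x[node.feature] < node.threshold else node.right
        contrib[node.feature] += child.weight - node.weight
        node = child
    return contrib, node.weight     # leaf weight is the tree's prediction

# Toy tree: root weight 0.5, splitting on feature 0 and then feature 1.
tree = Node(0.5, feature=0, threshold=1.0,
            left=Node(0.2, feature=1, threshold=3.0,
                      left=Node(-0.1), right=Node(0.4)),
            right=Node(0.8))
contrib, pred = contributions_weight(tree, [0.0, 5.0], 2)
# Row property: base (root weight) plus the contributions sums to the prediction.
assert abs(0.5 + sum(contrib) - pred) < 1e-12
```

Because the credits telescope, summing each row of the contribution matrix (plus the base score) always reproduces the prediction, which is the invariant both methods share.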

v0.2.4

06 May 22:11
3582194

This release adds an experimental parameter, create_missing_branch. If True, a separate branch is created for missing values, producing a ternary tree; the missing node is given the same weight value as the parent node. If False, missing values are sent down either the left or right branch, producing a binary tree. Defaults to False.
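Structurally, the ternary tree just adds a third child that missing values are routed to. The sketch below shows that routing with an optional missing branch whose node carries the parent's weight, as described above; the class and function are illustrative, not forust's internals.

```python
import math

class Node:
    def __init__(self, weight, feature=None, threshold=None,
                 left=None, right=None, missing=None):
        self.weight = weight
        self.feature = feature      # None for leaves
        self.threshold = threshold
        self.left = left
        self.right = right
        self.missing = missing      # set only when a missing branch was created

def route(node, x):
    """Route a record to a leaf and return its weight. With a missing
    branch (ternary tree), missing values take their own path."""
    while node.feature is not None:
        v = x[node.feature]
        if math.isnan(v) and node.missing is not None:
            node = node.missing
        else:
            # Without a missing branch, nan < threshold is False,
            # so a missing value would simply fall to the right here.
            node = node.left if v < node.threshold else node.right
    return node.weight

# Toy ternary tree: the missing node inherits the parent's weight (0.0).
tree = Node(0.0, feature=0, threshold=1.0,
            left=Node(-0.5), right=Node(0.5), missing=Node(0.0))
```

Giving the missing node the parent's weight means a missing value contributes nothing beyond what the parent already predicted, which keeps the experimental branch conservative.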

v0.2.3

01 May 02:03
6e18ff5
  • Fixed a bug where features with sparsely populated levels were reported as having zero variance.
  • Added subsample parameter to sample records prior to training each tree.
  • Added the missing parameter, so missing values can now be assigned to any float, rather than only np.nan.
  • Added a metadata store to the booster object, so info can be saved on the booster and loaded again after saving.
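The configurable missing value in the third bullet boils down to a sentinel check. A minimal sketch, with a hypothetical helper name; nan needs special handling because nan != nan.

```python
import math

def is_missing(value, missing=float("nan")):
    """Treat the configured sentinel float as missing (a sketch)."""
    if math.isnan(missing):
        return math.isnan(value)   # nan compares unequal to itself
    return value == missing
```

This is what lets a dataset use, say, -999.0 as its missing marker instead of np.nan.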

v0.2.1

23 Apr 02:39
5b53399

This release adds the predict_contributions method to the GradientBooster object, for both the Rust and Python APIs.

v0.2.2

23 Apr 21:34
9155f60

Hot fix for a bug in the contributions prediction where the base_score was ignored. Additionally, performance improvements were made that make contribution calculations ~5X faster.