GBM {Microsoft.VisualBasic.MachineLearning.XGBoost.train} | .NET clr documentation |
A tiny implementation of gradient boosted trees, based on XGBoost's scoring function and SLIQ's efficient tree-building algorithm. TGBoost builds each tree level-wise, as in SLIQ (by constructing an attribute list and a class list). TGBoost currently supports parallel learning on a single machine; its speed and memory consumption are comparable to XGBoost.
TGBoost supports most features found in other gradient boosting libraries:
+ Built-in losses: square error loss for regression, logistic loss for classification
+ Early stopping: evaluate on a validation set and stop early when the metric no longer improves
+ Feature importance: output feature importances after training
+ Regularization: lambda, gamma
+ Randomness: subsample, colsample
+ Weighted loss function: assign a weight to each sample
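The XGBoost-style scoring function that TGBoost borrows, including the lambda and gamma regularizers listed above, can be sketched as follows. This is a minimal illustration; the function name and signature are hypothetical, not part of the library's API.

```python
def split_gain(g_left, h_left, g_right, h_right, lam, gamma):
    """XGBoost-style gain of a binary split.

    g_*/h_* are sums of first/second-order gradients of the loss over
    the samples falling into each child; lam (lambda) shrinks leaf
    scores, gamma penalises adding a leaf.
    """
    def score(g, h):
        return g * g / (h + lam)

    g_parent = g_left + g_right
    h_parent = h_left + h_right
    return 0.5 * (score(g_left, h_left)
                  + score(g_right, h_right)
                  - score(g_parent, h_parent)) - gamma

# A candidate split is only taken when its gain is positive.
gain = split_gain(10.0, 5.0, -8.0, 4.0, lam=1.0, gamma=0.1)
```

During tree building, every candidate threshold on every feature is scored this way and the split with the largest positive gain wins.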
Two further features are novel:
+ Missing-value handling: XGBoost learns a default direction (left or right) for samples with a missing value. TGBoost takes a different approach: it enumerates sending the missing values to the left child, to the right child, or to a dedicated missing-value child, then chooses the best option. TGBoost therefore uses a ternary tree.
+ Categorical features: on each tree node, TGBoost orders the categories of a categorical feature by their statistic (gradient sum / hessian sum), then conducts split finding as for a numeric feature.

https://github.com/wepe/tgboost
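The ternary handling of missing values can be sketched like this: for a candidate threshold, the gradient statistics of the missing-value samples are tried in the left child, in the right child, and in a child of their own, and the placement with the best structure score wins. All names here are illustrative assumptions, not TGBoost's actual API.

```python
def leaf_score(g, h, lam):
    # Structure score of a leaf holding gradient sum g, hessian sum h.
    return g * g / (h + lam)

def best_missing_split(gl, hl, gr, hr, gm, hm, lam=1.0, gamma=0.0):
    """Enumerate the three placements of the missing-value samples.

    (gl, hl)/(gr, hr) are the gradient statistics of the non-missing
    samples left/right of the threshold; (gm, hm) belong to the
    samples whose feature value is missing. Returns (direction, gain)
    where direction is 'left', 'right', or 'own' (a dedicated
    missing-value child, which is what makes the tree ternary).
    """
    parent = leaf_score(gl + gr + gm, hl + hr + hm, lam)
    options = {
        "left":  leaf_score(gl + gm, hl + hm, lam) + leaf_score(gr, hr, lam),
        "right": leaf_score(gl, hl, lam) + leaf_score(gr + gm, hr + hm, lam),
        "own":   (leaf_score(gl, hl, lam) + leaf_score(gr, hr, lam)
                  + leaf_score(gm, hm, lam)),
    }
    direction = max(options, key=options.get)
    gain = 0.5 * (options[direction] - parent) - gamma
    return direction, gain

direction, gain = best_missing_split(gl=6.0, hl=3.0, gr=-6.0, hr=3.0,
                                     gm=4.0, hm=1.0)
```

In this toy example the missing-value gradients align with the left child, so the enumeration sends them left rather than into their own child.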
# namespace Microsoft.VisualBasic.MachineLearning.XGBoost.train
export class GBM extends Model {
eta: double;
first_round_pred: double;
loss: Loss;
trees: iterates(Tree);
}
loss: Loss
trees: iterates(Tree)
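Given the fields above, prediction is a standard additive model: start from `first_round_pred` and add `eta` times each tree's output; under logistic loss the raw margin is passed through a sigmoid to obtain a probability. A minimal Python sketch under those assumptions (the .NET class itself is not reproduced here):

```python
import math

def gbm_predict(x, trees, first_round_pred=0.0, eta=0.3, logistic=False):
    """Additive prediction of a trained GBM.

    trees is a list of callables, each mapping a sample x to a leaf
    score; eta is the shrinkage (learning rate) applied to every tree.
    """
    raw = first_round_pred + eta * sum(tree(x) for tree in trees)
    if logistic:
        # Logistic loss: map the raw margin to a probability.
        return 1.0 / (1.0 + math.exp(-raw))
    return raw

# Two stub "trees" standing in for trained regression trees.
trees = [lambda x: 1.0, lambda x: -0.5]
raw = gbm_predict(None, trees, first_round_pred=0.5, eta=0.1)
```

This mirrors the role of the `eta`, `first_round_pred`, `loss`, and `trees` members: each boosting round appends one more tree to the sum.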