FastForestRegressionTrainer.Options Class

Definition

Options for the FastForestRegressionTrainer as used in FastForest(Options).

public sealed class FastForestRegressionTrainer.Options : Microsoft.ML.Trainers.FastTree.FastForestOptionsBase
type FastForestRegressionTrainer.Options = class
inherit FastForestOptionsBase
Public NotInheritable Class FastForestRegressionTrainer.Options
Inherits FastForestOptionsBase
Inheritance

Object → TrainerInputBase → TrainerInputBaseWithLabel → TrainerInputBaseWithWeight → TrainerInputBaseWithGroupId → TreeOptions → FastForestOptionsBase → FastForestRegressionTrainer.Options

Fields

When a root split is impossible, allow training to proceed. (Inherited from TreeOptions)
Percentage of training examples used in each bag. Default is 0.7 (70%). (Inherited from TreeOptions)
Number of trees in each bag (0 for disabling bagging). (Inherited from TreeOptions)
Bias for calculating gradient for each feature bin for a categorical feature. (Inherited from TreeOptions)
Bundle low population bins. Bundle.None(0): no bundling, Bundle.AggregateLowPopulation(1): bundle low population, Bundle.Adjacent(2): bundle neighboring low-population bins. (Inherited from TreeOptions)
Whether to split based on multiple categorical feature values. (Inherited from TreeOptions)
Compress the tree ensemble. (Inherited from TreeOptions)
Whether to utilize the disk or the data's native transposition facilities (where applicable) when performing the transpose. (Inherited from TreeOptions)
The entropy (regularization) coefficient between 0 and 1. (Inherited from TreeOptions)
Column to use for example weight. (Inherited from TrainerInputBaseWithWeight)
Print execution time breakdown to ML.NET channel. (Inherited from TreeOptions)
Column to use for features. (Inherited from TrainerInputBase)
The feature first use penalty coefficient. (Inherited from TreeOptions)
Whether to collectivize features during dataset preparation to speed up training. (Inherited from TreeOptions)
The fraction of features (chosen randomly) to use on each iteration. Use 0.9 if only 90% of features are needed. Lower numbers help reduce over-fitting. (Inherited from TreeOptions)
The fraction of features (chosen randomly) to use on each split. If its value is 0.9, 10% of all features would be dropped in expectation. (Inherited from TreeOptions)
The feature re-use penalty (regularization) coefficient. (Inherited from TreeOptions)
The seed of the active feature selection. (Inherited from TreeOptions)
Tree fitting gain confidence requirement. Only consider a gain if its likelihood versus a random choice gain is above this value. (Inherited from TreeOptions)
The number of histograms in the pool (between 2 and numLeaves). (Inherited from TreeOptions)
Column to use for labels. (Inherited from TrainerInputBaseWithLabel)
Maximum number of distinct values (bins) per feature. (Inherited from TreeOptions)
Maximum categorical split groups to consider when splitting on a categorical feature. Split groups are a collection of split points. This is used to reduce overfitting when there are many categorical features. (Inherited from TreeOptions)
Maximum categorical split points to consider when splitting on a categorical feature. (Inherited from TreeOptions)
The minimal number of data points required to form a new tree leaf. (Inherited from TreeOptions)
Minimum categorical example percentage in a bin to consider for a split. Default is 0.1% of all training examples. (Inherited from TreeOptions)
Minimum categorical example count in a bin to consider for a split. (Inherited from TreeOptions)
The max number of leaves in each regression tree. (Inherited from TreeOptions)
The number of data points to be sampled from each leaf to find the distribution of labels. (Inherited from FastForestOptionsBase)
The number of threads to use. (Inherited from TreeOptions)
Total number of decision trees to create in the ensemble. (Inherited from TreeOptions)
Column to use for example groupId. (Inherited from TrainerInputBaseWithGroupId)
The seed of the random number generator. (Inherited from TreeOptions)
Whether to shuffle the labels on every iteration.
Smoothing parameter for tree regularization. (Inherited from TreeOptions)
The temperature of the randomized softmax distribution for choosing the feature. (Inherited from TreeOptions)
Sparsity level needed to use sparse feature representation. (Inherited from TreeOptions)
Calculate metric values for train/valid/test every k rounds. (Inherited from TreeOptions)
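A minimal configuration sketch showing how an Options object like this is typically populated and handed to the trainer. This is not verbatim from this page: the field names used here (NumberOfTrees, NumberOfLeaves, FeatureFraction, BaggingExampleFraction, LabelColumnName, FeatureColumnName) are assumed from the ML.NET FastTree option classes and should be verified against your Microsoft.ML version; it also requires the Microsoft.ML NuGet package, so it is a configuration sketch rather than a standalone program.

```csharp
using Microsoft.ML;
using Microsoft.ML.Trainers.FastTree;

var mlContext = new MLContext(seed: 1);

// Configure the forest through the Options object instead of
// passing individual arguments to the FastForest(...) overload.
var options = new FastForestRegressionTrainer.Options
{
    LabelColumnName = "Label",        // column to use for labels
    FeatureColumnName = "Features",   // column to use for features
    NumberOfTrees = 100,              // total decision trees in the ensemble
    NumberOfLeaves = 20,              // max leaves per regression tree
    FeatureFraction = 0.7,            // fraction of features sampled each iteration
    BaggingExampleFraction = 0.7      // fraction of examples per bag (the stated default)
};

var trainer = mlContext.Regression.Trainers.FastForest(options);
```

The Options form is useful when several of the fields listed above need non-default values at once; for one or two common settings, the simpler FastForest overload with named arguments may suffice.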