SdcaNonCalibratedMulticlassTrainer Class

Definition

The IEstimator<TTransformer> to predict a target using a linear multiclass classifier. The trained model, LinearMulticlassModelParameters, produces per-class scores that are not calibrated to probabilities.

public sealed class SdcaNonCalibratedMulticlassTrainer : Microsoft.ML.Trainers.SdcaMulticlassTrainerBase<Microsoft.ML.Trainers.LinearMulticlassModelParameters>
type SdcaNonCalibratedMulticlassTrainer = class
    inherit SdcaMulticlassTrainerBase<LinearMulticlassModelParameters>
Public NotInheritable Class SdcaNonCalibratedMulticlassTrainer
Inherits SdcaMulticlassTrainerBase(Of LinearMulticlassModelParameters)
Inheritance

Remarks

To create this trainer, use SdcaNonCalibrated or SdcaNonCalibrated(Options).
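For example, a minimal sketch of both creation paths (the column names "Label" and "Features" are illustrative defaults, not requirements):

var mlContext = new Microsoft.ML.MLContext(seed: 0);

// Create the trainer with the simple factory method.
var trainer = mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated(
    labelColumnName: "Label",
    featureColumnName: "Features");

// Or create it with advanced options.
var trainerWithOptions = mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated(
    new Microsoft.ML.Trainers.SdcaNonCalibratedMulticlassTrainer.Options
    {
        LabelColumnName = "Label",
        FeatureColumnName = "Features"
    });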

Input and Output Columns

The input label column data must be of key type and the feature column must be a known-sized vector of Single (see the data preparation sketch after the table below).

This trainer outputs the following columns:

Output Column Name | Column Type | Description
Score | Vector of Single | The scores of all classes. A higher value means a higher likelihood of falling into the associated class. If the i-th element has the largest value, the predicted label index is i. Note that i is a zero-based index.
PredictedLabel | Key type | The predicted label's index. If its value is i, the actual label is the i-th category in the key-valued input label type.
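The key-type label and known-sized feature vector requirements are usually met with standard conversion transforms placed before the trainer. A minimal sketch, assuming a hypothetical FlowerInput class with a string label and two numeric feature columns:

using Microsoft.ML;

// Hypothetical input schema, used only for illustration.
public class FlowerInput
{
    public string Species { get; set; }    // raw string label
    public float PetalLength { get; set; }
    public float PetalWidth { get; set; }
}

var mlContext = new MLContext();
IDataView data = mlContext.Data.LoadFromEnumerable(new FlowerInput[] { /* ... */ });

// Map the string label to a key type and build a known-sized vector of Single
// named "Features", as the trainer expects.
var pipeline = mlContext.Transforms.Conversion.MapValueToKey("Label", "Species")
    .Append(mlContext.Transforms.Concatenate("Features",
        nameof(FlowerInput.PetalLength), nameof(FlowerInput.PetalWidth)))
    .Append(mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated());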

Trainer Characteristics

Machine learning task | Multiclass classification
Is normalization required? | Yes
Is caching required? | No
Required NuGet in addition to Microsoft.ML | None
Exportable to ONNX | Yes

Scoring Function

This trainer learns a linear model to solve multiclass classification problems. Assume that the number of classes is $m$ and the number of features is $n$. It assigns the $c$-th class a coefficient vector $\textbf{w}_c \in {\mathbb R}^n$ and a bias $b_c \in {\mathbb R}$, for $c=1,\dots,m$. Given a feature vector $\textbf{x} \in {\mathbb R}^n$, the $c$-th class's score is $\hat{y}^c = \textbf{w}_c^T \textbf{x} + b_c$. The $c$-th value in the output Score column is just $\hat{y}^c$.
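For instance, with $m = 3$ classes and $n = 2$ features, the per-class scores are computed as follows (the weights and biases below are made-up values, for illustration only):

// Illustrative only: hand-picked coefficients for m = 3 classes, n = 2 features.
float[][] w = { new[] { 1.0f, -0.5f }, new[] { 0.2f, 0.8f }, new[] { -1.0f, 0.3f } };
float[] b = { 0.1f, -0.2f, 0.0f };
float[] x = { 2.0f, 1.0f };   // feature vector

var scores = new float[3];
for (int c = 0; c < 3; c++)
{
    // score_c = w_c^T x + b_c
    scores[c] = w[c][0] * x[0] + w[c][1] * x[1] + b[c];
}
// scores = { 1.6, 1.0, -1.7 }; the predicted label index is 0 (the largest score).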

Training Algorithm Details

See the documentation of SdcaMulticlassTrainerBase.
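The SDCA-specific knobs (regularization, iteration count, convergence tolerance) are exposed through SdcaNonCalibratedMulticlassTrainer.Options. A sketch with arbitrary illustrative values; leaving the regularization and iteration settings unset lets the trainer choose them automatically:

var options = new Microsoft.ML.Trainers.SdcaNonCalibratedMulticlassTrainer.Options
{
    LabelColumnName = "Label",
    FeatureColumnName = "Features",
    // Arbitrary values chosen for illustration.
    L2Regularization = 0.1f,
    L1Regularization = 0.01f,
    MaximumNumberOfIterations = 100,
    ConvergenceTolerance = 0.001f
};

var trainer = mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated(options);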

Check the See Also section for links to usage examples.

Fields

FeatureColumn

The feature column that the trainer expects.

(Inherited from TrainerEstimatorBase<TTransformer,TModel>)
LabelColumn

The label column that the trainer expects. Can be null, which indicates that the label is not used for training.

(Inherited from TrainerEstimatorBase<TTransformer,TModel>)
WeightColumn

The weight column that the trainer expects. Can be null, which indicates that the weight is not used for training.

(Inherited from TrainerEstimatorBase<TTransformer,TModel>)

Properties

Info (Inherited from StochasticTrainerBase<TTransformer,TModel>)

Methods

Fit(IDataView)

Trains and returns an ITransformer. (A usage sketch follows this list.)

(Inherited from TrainerEstimatorBase<TTransformer,TModel>)
GetOutputSchema(SchemaShape) (Inherited from TrainerEstimatorBase<TTransformer,TModel>)
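Continuing the pipeline sketch above, a minimal example of fitting and scoring, assuming trainData and testData are IDataView instances that have already been loaded:

// Fit returns a transformer that can score new data.
ITransformer model = pipeline.Fit(trainData);

// The Score column holds the raw per-class scores described above;
// PredictedLabel holds the key-typed predicted class.
IDataView predictions = model.Transform(testData);
var metrics = mlContext.MulticlassClassification.Evaluate(predictions);
Console.WriteLine($"MicroAccuracy: {metrics.MicroAccuracy:F3}");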

Extension Methods

AppendCacheCheckpoint<TTrans>(IEstimator<TTrans>, IHostEnvironment)

Append a 'caching checkpoint' to the estimator chain. This will ensure that the downstream estimators will be trained against cached data. It is helpful to have a caching checkpoint before trainers that take multiple data passes.
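Because SDCA makes multiple passes over the training data, placing a caching checkpoint just before this trainer is a common pattern. A sketch (the feature column names are placeholders; MLContext can be passed as the IHostEnvironment argument):

var cachedPipeline = mlContext.Transforms.Concatenate("Features", "F1", "F2")
    .AppendCacheCheckpoint(mlContext)
    .Append(mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated());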

WithOnFitDelegate<TTransformer>(IEstimator<TTransformer>, Action<TTransformer>)

Given an estimator, returns a wrapping object that will call a delegate once Fit(IDataView) is called. It is often important for an estimator to return information about what was fit, which is why the Fit(IDataView) method returns a specifically typed object rather than just a general ITransformer. At the same time, IEstimator<TTransformer> instances are often formed into pipelines with many objects via EstimatorChain<TLastTransformer>, where the estimator whose transformer we want may be buried somewhere in the chain. For that scenario, this method lets us attach a delegate that will be called once Fit is called.
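A sketch of using the delegate to capture the trained LinearMulticlassModelParameters from a trainer buried inside a chain (column names and data variables are illustrative):

Microsoft.ML.Trainers.LinearMulticlassModelParameters trainedParameters = null;

var pipelineWithCallback = mlContext.Transforms.Concatenate("Features", "F1", "F2")
    .Append(mlContext.MulticlassClassification.Trainers.SdcaNonCalibrated()
        // The delegate receives the fitted MulticlassPredictionTransformer,
        // whose Model property is the LinearMulticlassModelParameters.
        .WithOnFitDelegate(transformer => trainedParameters = transformer.Model));

pipelineWithCallback.Fit(trainData);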

Applies to

See also