NormalizationCatalog.NormalizeGlobalContrast Method

Definition

Create a GlobalContrastNormalizingEstimator, which normalizes columns individually by applying global contrast normalization. Setting ensureZeroMean to true will apply a pre-processing step that makes the specified column's mean the zero vector.

public static Microsoft.ML.Transforms.GlobalContrastNormalizingEstimator NormalizeGlobalContrast (this Microsoft.ML.TransformsCatalog catalog, string outputColumnName, string inputColumnName = default, bool ensureZeroMean = true, bool ensureUnitStandardDeviation = false, float scale = 1);
static member NormalizeGlobalContrast : Microsoft.ML.TransformsCatalog * string * string * bool * bool * single -> Microsoft.ML.Transforms.GlobalContrastNormalizingEstimator
<Extension()>
Public Function NormalizeGlobalContrast (catalog As TransformsCatalog, outputColumnName As String, Optional inputColumnName As String = Nothing, Optional ensureZeroMean As Boolean = true, Optional ensureUnitStandardDeviation As Boolean = false, Optional scale As Single = 1) As GlobalContrastNormalizingEstimator
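
For orientation, the following is a minimal sketch of calling this extension method with a separate input and output column and the default options. The row class, column names, and class name used here (Row, FeaturesGcn, NormalizeGlobalContrastQuickStart) are illustrative, not part of the API; see the full sample under Examples below.

using System.Collections.Generic;
using Microsoft.ML;
using Microsoft.ML.Data;

namespace Samples.Dynamic
{
    class NormalizeGlobalContrastQuickStart
    {
        public static void Example()
        {
            var mlContext = new MLContext();

            // A few rows of known-sized vectors of Single, the input type
            // this estimator operates over.
            var samples = new List<Row>()
            {
                new Row() { Features = new float[3] { 1, 2, 3 } },
                new Row() { Features = new float[3] { 0, 0, 1 } }
            };
            var data = mlContext.Data.LoadFromEnumerable(samples);

            // Write the normalized vectors to a new "FeaturesGcn" column and
            // leave "Features" untouched, keeping the defaults:
            // ensureZeroMean: true, ensureUnitStandardDeviation: false,
            // scale: 1.
            var pipeline = mlContext.Transforms.NormalizeGlobalContrast(
                outputColumnName: "FeaturesGcn",
                inputColumnName: "Features");

            var transformedData = pipeline.Fit(data).Transform(data);
        }

        private class Row
        {
            [VectorType(3)]
            public float[] Features { get; set; }
        }
    }
}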

Parameters

catalog
TransformsCatalog

The transform's catalog.

outputColumnName
String

Name of the column resulting from the transformation of inputColumnName. This column's data type will be the same as the input column's data type.

inputColumnName
String

Name of the column to normalize. If set to null, the value of the outputColumnName will be used as source. This estimator operates over known-sized vectors of Single.

ensureZeroMean
Boolean

If true, subtract the mean from each value before normalizing; otherwise, use the raw input.

ensureUnitStandardDeviation
Boolean

If true, the resulting vector's standard deviation will be one. Otherwise, the resulting vector's L2-norm will be one.

scale
Single

Scale features by this value. How scale combines with ensureZeroMean and ensureUnitStandardDeviation is sketched just after this parameter list.
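
Taken together, ensureZeroMean, ensureUnitStandardDeviation, and scale describe a per-row computation along the lines sketched below. This is only an illustration of the parameter descriptions above, not the transform's implementation; the helper name GcnSketch and its handling of corner cases (such as a zero divisor) are assumptions.

using System;
using System.Linq;

// Illustrative sketch of the per-row arithmetic suggested by the parameter
// descriptions above; the actual transform may differ in its details.
static class GcnSketch
{
    public static float[] Apply(
        float[] row,
        bool ensureZeroMean = true,
        bool ensureUnitStandardDeviation = false,
        float scale = 1f)
    {
        // Optionally center the row so that its mean becomes zero.
        float mean = ensureZeroMean ? row.Average() : 0f;
        var centered = row.Select(x => x - mean).ToArray();

        // Divide by the standard deviation of the (possibly centered) values
        // or by their L2 norm, then apply the scale factor.
        double sumOfSquares = centered.Sum(x => (double)x * x);
        double divisor = ensureUnitStandardDeviation
            ? Math.Sqrt(sumOfSquares / row.Length)
            : Math.Sqrt(sumOfSquares);

        return centered.Select(x => (float)(scale * x / divisor)).ToArray();
    }
}

Under this sketch, with the defaults, a row such as { 1, 2, 3 } is centered to { -1, 0, 1 } and divided by its L2 norm, giving approximately { -0.71, 0, 0.71 }.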

Returns

GlobalContrastNormalizingEstimator

Examples

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Data;

namespace Samples.Dynamic
{
    class NormalizeGlobalContrast
    {
        public static void Example()
        {
            // Create a new ML context, for ML.NET operations. It can be used for
            // exception tracking and logging, as well as the source of randomness.
            var mlContext = new MLContext();
            var samples = new List<DataPoint>()
            {
                new DataPoint(){ Features = new float[4] { 1, 1, 0, 0} },
                new DataPoint(){ Features = new float[4] { 2, 2, 0, 0} },
                new DataPoint(){ Features = new float[4] { 1, 0, 1, 0} },
                new DataPoint(){ Features = new float[4] { 0, 1, 0, 1} }
            };
            // Convert training data to IDataView, the general data type used in
            // ML.NET.
            var data = mlContext.Data.LoadFromEnumerable(samples);
            var pipeline = mlContext.Transforms.NormalizeGlobalContrast(
                "Features", ensureZeroMean: false, scale: 2,
                ensureUnitStandardDeviation: true);

            // Now we can transform the data and look at the output to confirm the
            // behavior of the estimator. This operation doesn't actually evaluate
            // data until we read the data below.
            var transformer = pipeline.Fit(data);
            var transformedData = transformer.Transform(data);

            var column = transformedData.GetColumn<float[]>("Features").ToArray();
            foreach (var row in column)
                Console.WriteLine(string.Join(", ", row.Select(x => x.ToString(
                    "f4"))));
            // Expected output:
            //  2.0000, 2.0000,-2.0000,-2.0000
            //  2.0000, 2.0000,-2.0000,-2.0000
            //  2.0000,-2.0000, 2.0000,-2.0000
            // -2.0000, 2.0000,-2.0000, 2.0000
        }

        private class DataPoint
        {
            [VectorType(4)]
            public float[] Features { get; set; }
        }
    }
}

Applies to