TreeExtensions.FastTreeTweedie Method

Definition

Overloads

FastTreeTweedie(RegressionCatalog+RegressionTrainers, String, String, String, Int32, Int32, Int32, Double)

Creates a FastTreeTweedieTrainer, which predicts a target using a decision tree regression model.

FastTreeTweedie(RegressionCatalog+RegressionTrainers, FastTreeTweedieTrainer+Options)

Creates a FastTreeTweedieTrainer with advanced options, which predicts a target using a decision tree regression model.

FastTreeTweedie(RegressionCatalog+RegressionTrainers, String, String, String, Int32, Int32, Int32, Double)

Creates a FastTreeTweedieTrainer, which predicts a target using a decision tree regression model.

public static Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer FastTreeTweedie (this Microsoft.ML.RegressionCatalog.RegressionTrainers catalog, string labelColumnName = "Label", string featureColumnName = "Features", string exampleWeightColumnName = default, int numberOfLeaves = 20, int numberOfTrees = 100, int minimumExampleCountPerLeaf = 10, double learningRate = 0.2);
static member FastTreeTweedie : Microsoft.ML.RegressionCatalog.RegressionTrainers * string * string * string * int * int * int * double -> Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer
<Extension()>
Public Function FastTreeTweedie (catalog As RegressionCatalog.RegressionTrainers, Optional labelColumnName As String = "Label", Optional featureColumnName As String = "Features", Optional exampleWeightColumnName As String = Nothing, Optional numberOfLeaves As Integer = 20, Optional numberOfTrees As Integer = 100, Optional minimumExampleCountPerLeaf As Integer = 10, Optional learningRate As Double = 0.2) As FastTreeTweedieTrainer

Parameters

labelColumnName
String

The name of the label column. The column data must be Single.

featureColumnName
String

The name of the feature column. The column data must be a known-sized vector of Single.

exampleWeightColumnName
String

The name of the example weight column (optional).

numberOfLeaves
Int32

The maximum number of leaves per decision tree.

numberOfTrees
Int32

The total number of decision trees to create in the ensemble.

minimumExampleCountPerLeaf
Int32

The minimal number of data points required to form a new tree leaf.

learningRate
Double

The learning rate.
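
The sketch below shows how each of these parameters maps onto a call. It is illustrative only; the column names are placeholders and the hyperparameter values simply repeat the defaults rather than being recommendations.

using Microsoft.ML;

// Illustrative sketch: the column names are placeholders and the hyperparameter
// values repeat the defaults; tune them for your own data.
var mlContext = new MLContext(seed: 0);

var trainer = mlContext.Regression.Trainers.FastTreeTweedie(
    labelColumnName: "Label",              // Single-typed label column
    featureColumnName: "Features",         // known-sized vector of Single
    exampleWeightColumnName: "Weight",     // optional; pass null for unweighted data
    numberOfLeaves: 20,                    // maximum leaves per tree
    numberOfTrees: 100,                    // trees in the ensemble
    minimumExampleCountPerLeaf: 10,        // minimum examples required to form a leaf
    learningRate: 0.2);                    // learning rate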

Returns

FastTreeTweedieTrainer.

Examples

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Data;

namespace Samples.Dynamic.Trainers.Regression
{
    public static class FastTreeTweedieRegression
    {
        // This example requires installation of additional NuGet
        // package for Microsoft.ML.FastTree found at
        // https://www.nuget.org/packages/Microsoft.ML.FastTree/
        public static void Example()
        {
            // Create a new context for ML.NET operations. It can be used for
            // exception tracking and logging, as a catalog of available operations
            // and as the source of randomness. Setting the seed to a fixed number
            // in this example to make outputs deterministic.
            var mlContext = new MLContext(seed: 0);

            // Create a list of training data points.
            var dataPoints = GenerateRandomDataPoints(1000);

            // Convert the list of data points to an IDataView object, which is
            // consumable by ML.NET API.
            var trainingData = mlContext.Data.LoadFromEnumerable(dataPoints);

            // Define the trainer.
            var pipeline = mlContext.Regression.Trainers.FastTreeTweedie(
                labelColumnName: nameof(DataPoint.Label),
                featureColumnName: nameof(DataPoint.Features));

            // Train the model.
            var model = pipeline.Fit(trainingData);

            // Create testing data. Use different random seed to make it different
            // from training data.
            var testData = mlContext.Data.LoadFromEnumerable(
                GenerateRandomDataPoints(5, seed: 123));

            // Run the model on test data set.
            var transformedTestData = model.Transform(testData);

            // Convert IDataView object to a list.
            var predictions = mlContext.Data.CreateEnumerable<Prediction>(
                transformedTestData, reuseRowObject: false).ToList();

            // Look at 5 predictions for the Label, side by side with the actual
            // Label for comparison.
            foreach (var p in predictions)
                Console.WriteLine($"Label: {p.Label:F3}, Prediction: {p.Score:F3}");

            // Expected output:
            //   Label: 0.985, Prediction: 0.945
            //   Label: 0.155, Prediction: 0.104
            //   Label: 0.515, Prediction: 0.515
            //   Label: 0.566, Prediction: 0.448
            //   Label: 0.096, Prediction: 0.082

            // Evaluate the overall metrics
            var metrics = mlContext.Regression.Evaluate(transformedTestData);
            PrintMetrics(metrics);

            // Expected output:
            //   Mean Absolute Error: 0.04
            //   Mean Squared Error: 0.00
            //   Root Mean Squared Error: 0.06
            //   RSquared: 0.96 (closer to 1 is better. The worst case is 0)
        }

        private static IEnumerable<DataPoint> GenerateRandomDataPoints(int count,
            int seed = 0)
        {
            var random = new Random(seed);
            for (int i = 0; i < count; i++)
            {
                float label = (float)random.NextDouble();
                yield return new DataPoint
                {
                    Label = label,
                    // Create random features that are correlated with the label.
                    Features = Enumerable.Repeat(label, 50).Select(
                        x => x + (float)random.NextDouble()).ToArray()
                };
            }
        }

        // Example with label and 50 feature values. A data set is a collection of
        // such examples.
        private class DataPoint
        {
            public float Label { get; set; }
            [VectorType(50)]
            public float[] Features { get; set; }
        }

        // Class used to capture predictions.
        private class Prediction
        {
            // Original label.
            public float Label { get; set; }
            // Predicted score from the trainer.
            public float Score { get; set; }
        }

        // Print some evaluation metrics for regression problems.
        private static void PrintMetrics(RegressionMetrics metrics)
        {
            Console.WriteLine("Mean Absolute Error: " + metrics.MeanAbsoluteError);
            Console.WriteLine("Mean Squared Error: " + metrics.MeanSquaredError);
            Console.WriteLine(
                "Root Mean Squared Error: " + metrics.RootMeanSquaredError);

            Console.WriteLine("RSquared: " + metrics.RSquared);
        }
    }
}

Applies to

FastTreeTweedie(RegressionCatalog+RegressionTrainers, FastTreeTweedieTrainer+Options)

Creates a FastTreeTweedieTrainer with advanced options, which predicts a target using a decision tree regression model.

public static Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer FastTreeTweedie (this Microsoft.ML.RegressionCatalog.RegressionTrainers catalog, Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer.Options options);
static member FastTreeTweedie : Microsoft.ML.RegressionCatalog.RegressionTrainers * Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer.Options -> Microsoft.ML.Trainers.FastTree.FastTreeTweedieTrainer
<Extension()>
Public Function FastTreeTweedie (catalog As RegressionCatalog.RegressionTrainers, options As FastTreeTweedieTrainer.Options) As FastTreeTweedieTrainer

Parameters

options
FastTreeTweedieTrainer.Options

Trainer options.
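
The options object bundles the column names, the boosted-tree hyperparameters, and Tweedie-specific settings in one place. The sketch below is illustrative only; it assumes the Options type exposes an Index field for the Tweedie power parameter, and the values shown are placeholders rather than recommendations.

using Microsoft.ML;
using Microsoft.ML.Trainers.FastTree;

// Illustrative sketch: values are placeholders, not tuned recommendations.
var mlContext = new MLContext(seed: 0);

var options = new FastTreeTweedieTrainer.Options
{
    LabelColumnName = "Label",
    FeatureColumnName = "Features",
    NumberOfTrees = 100,
    NumberOfLeaves = 20,
    LearningRate = 0.2,
    // Assumption: Index is the Tweedie power parameter (1 behaves like a
    // Poisson loss, 2 like a Gamma loss); 1.5 is the library default.
    Index = 1.5
};

var trainer = mlContext.Regression.Trainers.FastTreeTweedie(options);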

Returns

FastTreeTweedieTrainer.

Examples

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers.FastTree;

namespace Samples.Dynamic.Trainers.Regression
{
    public static class FastTreeTweedieWithOptionsRegression
    {
        // This example requires installation of additional NuGet
        // package for Microsoft.ML.FastTree found at
        // https://www.nuget.org/packages/Microsoft.ML.FastTree/
        public static void Example()
        {
            // Create a new context for ML.NET operations. It can be used for
            // exception tracking and logging, as a catalog of available operations
            // and as the source of randomness. Setting the seed to a fixed number
            // in this example to make outputs deterministic.
            var mlContext = new MLContext(seed: 0);

            // Create a list of training data points.
            var dataPoints = GenerateRandomDataPoints(1000);

            // Convert the list of data points to an IDataView object, which is
            // consumable by ML.NET API.
            var trainingData = mlContext.Data.LoadFromEnumerable(dataPoints);

            // Define trainer options.
            var options = new FastTreeTweedieTrainer.Options
            {
                LabelColumnName = nameof(DataPoint.Label),
                FeatureColumnName = nameof(DataPoint.Features),
                // Use L2Norm for early stopping.
                EarlyStoppingMetric =
                    Microsoft.ML.Trainers.FastTree.EarlyStoppingMetric.L2Norm,

                // Create a simpler model by penalizing usage of new features.
                FeatureFirstUsePenalty = 0.1,
                // Reduce the number of trees to 50.
                NumberOfTrees = 50
            };

            // Define the trainer.
            var pipeline =
                mlContext.Regression.Trainers.FastTreeTweedie(options);

            // Train the model.
            var model = pipeline.Fit(trainingData);

            // Create testing data. Use different random seed to make it different
            // from training data.
            var testData = mlContext.Data.LoadFromEnumerable(
                GenerateRandomDataPoints(5, seed: 123));

            // Run the model on test data set.
            var transformedTestData = model.Transform(testData);

            // Convert IDataView object to a list.
            var predictions = mlContext.Data.CreateEnumerable<Prediction>(
                transformedTestData, reuseRowObject: false).ToList();

            // Look at 5 predictions for the Label, side by side with the actual
            // Label for comparison.
            foreach (var p in predictions)
                Console.WriteLine($"Label: {p.Label:F3}, Prediction: {p.Score:F3}");

            // Expected output:
            //   Label: 0.985, Prediction: 0.954
            //   Label: 0.155, Prediction: 0.103
            //   Label: 0.515, Prediction: 0.450
            //   Label: 0.566, Prediction: 0.515
            //   Label: 0.096, Prediction: 0.078

            // Evaluate the overall metrics
            var metrics = mlContext.Regression.Evaluate(transformedTestData);
            PrintMetrics(metrics);

            // Expected output:
            //   Mean Absolute Error: 0.04
            //   Mean Squared Error: 0.00
            //   Root Mean Squared Error: 0.05
            //   RSquared: 0.98 (closer to 1 is better. The worst case is 0)
        }

        private static IEnumerable<DataPoint> GenerateRandomDataPoints(int count,
            int seed = 0)
        {
            var random = new Random(seed);
            for (int i = 0; i < count; i++)
            {
                float label = (float)random.NextDouble();
                yield return new DataPoint
                {
                    Label = label,
                    // Create random features that are correlated with the label.
                    Features = Enumerable.Repeat(label, 50).Select(
                        x => x + (float)random.NextDouble()).ToArray()
                };
            }
        }

        // Example with label and 50 feature values. A data set is a collection of
        // such examples.
        private class DataPoint
        {
            public float Label { get; set; }
            [VectorType(50)]
            public float[] Features { get; set; }
        }

        // Class used to capture predictions.
        private class Prediction
        {
            // Original label.
            public float Label { get; set; }
            // Predicted score from the trainer.
            public float Score { get; set; }
        }

        // Print some evaluation metrics for regression problems.
        private static void PrintMetrics(RegressionMetrics metrics)
        {
            Console.WriteLine("Mean Absolute Error: " + metrics.MeanAbsoluteError);
            Console.WriteLine("Mean Squared Error: " + metrics.MeanSquaredError);
            Console.WriteLine(
                "Root Mean Squared Error: " + metrics.RootMeanSquaredError);

            Console.WriteLine("RSquared: " + metrics.RSquared);
        }
    }
}

Applies to