BayesianParameterSampling Class
Defines Bayesian sampling over a hyperparameter search space.
Bayesian sampling intelligently picks the next sample of hyperparameters based on how the previous samples performed, so that the new sample improves the reported primary metric.
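The idea described above can be illustrated with a toy sequential search. This is only a minimal sketch of "use earlier results to choose the next sample", not Azure ML's actual Bayesian optimization algorithm; the function name and strategy (a crude exploit-near-best step with occasional random exploration) are illustrative assumptions.

```python
import random

# Toy sketch: each new sample is chosen using the results of earlier ones.
# This is NOT the real Bayesian optimization used by the service.
def toy_sequential_search(objective, low, high, n_iters=20, seed=0):
    rng = random.Random(seed)
    history = []  # (x, score) pairs observed so far
    for i in range(n_iters):
        if i < 3:
            # Warm-up: sample uniformly while there is little information.
            x = rng.uniform(low, high)
        else:
            # Exploit: propose a point near the best sample seen so far,
            # occasionally exploring at random to avoid local optima.
            best_x, _ = max(history, key=lambda p: p[1])
            if rng.random() < 0.25:
                x = rng.uniform(low, high)
            else:
                width = (high - low) * 0.1
                x = min(high, max(low, best_x + rng.uniform(-width, width)))
        history.append((x, objective(x)))
    return max(history, key=lambda p: p[1])

# Maximize a simple primary metric with a peak at x = 2.
best_x, best_score = toy_sequential_search(lambda x: -(x - 2) ** 2, 0.0, 5.0)
print(best_x, best_score)
```

Because every sample after the warm-up is conditioned on the history, the search concentrates around promising regions rather than sampling the space uniformly.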
A dictionary containing each parameter and its distribution. The dictionary key is the name of the parameter. Note that only choice, quniform, and uniform are supported for Bayesian optimization.
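The shape of such a dictionary can be sketched as below. In actual Azure ML code the expression helpers `choice`, `uniform`, and `quniform` are imported from `azureml.train.hyperdrive`; the lightweight stand-in definitions and the tuple encoding here are assumptions made so the sketch is self-contained, and the parameter names are hypothetical.

```python
# Hypothetical stand-ins for the parameter expression helpers; real code
# would import choice, uniform, and quniform from azureml.train.hyperdrive.
def choice(*options):
    """A discrete set of values to pick from."""
    return ("choice", list(options))

def uniform(low, high):
    """A continuous uniform distribution over [low, high]."""
    return ("uniform", (low, high))

def quniform(low, high, q):
    """A uniform distribution quantized to multiples of q."""
    return ("quniform", (low, high, q))

# Only choice, quniform, and uniform are supported for Bayesian sampling.
# Keys are the hyperparameter names; values are their distributions.
param_space = {
    "learning_rate": uniform(0.01, 0.1),
    "batch_size": choice(16, 32, 64),
    "num_layers": quniform(1, 5, 1),
}
```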
Note that when using Bayesian sampling, the number of concurrent runs affects the effectiveness of the tuning process: a smaller number of concurrent runs typically leads to better sampling convergence, because runs launched concurrently cannot benefit from the results of runs that are still in progress.
Bayesian sampling does not support early termination policies. When using Bayesian parameter sampling, use NoTerminationPolicy, set the early termination policy to None, or leave off the early_termination_policy parameter.
For more information about using BayesianParameterSampling, see the tutorial Tune hyperparameters for your model.
SAMPLING_NAME = 'BayesianOptimization'