Performs the Gaussian error linear unit (GELU) activation function on every element in InputTensor, placing the result into the corresponding element of OutputTensor.
f(x) = 0.5 * x * (1.0 + erf(x / sqrt(2)))
Where erf(x) is the error function, as computed by DML_ELEMENT_WISE_ERF_OPERATOR_DESC.
Important
This API is available as part of the DirectML standalone redistributable package (see Microsoft.AI.DirectML version 1.9 and later). Also see DirectML version history.
Syntax
struct DML_ACTIVATION_GELU_OPERATOR_DESC
{
    const DML_TENSOR_DESC* InputTensor;
    const DML_TENSOR_DESC* OutputTensor;
};
Members
InputTensor
Type: const DML_TENSOR_DESC*
The input tensor to read from.
OutputTensor
Type: const DML_TENSOR_DESC*
The output tensor to write the results to.
Availability
This operator was introduced in DML_FEATURE_LEVEL_5_1.
Tensor constraints
InputTensor and OutputTensor must have the same DataType, DimensionCount, and Sizes.
Tensor support
Tensor | Kind | Supported dimension counts | Supported data types
---|---|---|---
InputTensor | Input | 1 to 8 | FLOAT32, FLOAT16
OutputTensor | Output | 1 to 8 | FLOAT32, FLOAT16
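As a sketch of how this desc might be wired up (assuming inputDesc and outputDesc are previously initialized DML_TENSOR_DESC structures that satisfy the tensor constraints above; the variable names are illustrative):

```cpp
// inputDesc and outputDesc must have identical DataType, DimensionCount,
// and Sizes, per the tensor constraints above.
DML_ACTIVATION_GELU_OPERATOR_DESC geluDesc = {};
geluDesc.InputTensor  = &inputDesc;
geluDesc.OutputTensor = &outputDesc;

DML_OPERATOR_DESC opDesc = {};
opDesc.Type = DML_OPERATOR_ACTIVATION_GELU;
opDesc.Desc = &geluDesc;
// opDesc can then be passed to IDMLDevice::CreateOperator to create
// a compilable operator instance.
```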
Requirements
Requirement | Value
---|---
Header | directml.h