This section describes the built-in functions of BrainScript.
The declarations of all built-in functions can be found in CNTK.core.bs, which is located next to the CNTK binary.
The primitive operations and layers are declared in the global namespace. Additional operations are declared inside namespaces and are referenced with the corresponding prefix (e.g. BS.RNNs.LSTMP).
Layers
DenseLayer{outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}
ConvolutionalLayer{numOutputChannels, filterShape, activation=Identity, init='uniform', initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}
MaxPoolingLayer{filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
AveragePoolingLayer{filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
EmbeddingLayer{outDim, embeddingPath='', transpose=false}
RecurrentLSTMLayer{outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}
DelayLayer{T=1, defaultHiddenActivation=0}
Dropout
BatchNormalizationLayer{spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}
LayerNormalizationLayer{initialScale=1, initialBias=0}
StabilizerLayer{}
FeatureMVNLayer{}
Layer building
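The layer factories above compose into full models. Below is a minimal sketch of that composition, assuming the `Sequential` combinator and the `ReLU` activation from the standard BrainScript library; the dimensions (784, 512, 10) are illustrative only:

```
# Hypothetical two-layer classifier over a 784-dimensional input.
model = Sequential (
    DenseLayer {512, activation=ReLU} :   # hidden layer
    DenseLayer {10}                       # linear output layer (Identity activation)
)
features = Input (784)
z = model (features)                      # z holds the unnormalized class scores
```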
Activation functions
Sigmoid(x)
Tanh(x)
ReLU(x)
Softmax(x)
Hardmax(x)
Elementwise operations, unary
Abs(x)
Ceil(x)
Cosine(x)
Clip(x, minValue, maxValue)
Exp(x)
Floor(x)
Log(x)
Negate(x) (operator form: -x)
BS.Boolean.Not(b) (operator form: !x)
Reciprocal(x)
Round(x)
Sin(x)
Sqrt(x)
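Unary operations apply elementwise and nest freely. A small illustrative composition (the function name and clipping bounds are assumptions, not part of the library):

```
# Log-compress magnitudes, clipping first so Log never sees zero.
logCompress (x) = Log (Clip (Abs (x), 0.0001, 10000))
```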
Elementwise operations, binary
ElementTimes(x, y) (operator form: x .* y)
Minus(x, y) (operator form: x - y)
Plus(x, y) (operator form: x + y)
LogPlus(x, y)
Less(x, y)
Equal(x, y)
Greater(x, y)
GreaterEqual(x, y)
NotEqual(x, y)
LessEqual(x, y)
BS.Boolean.And(a, b)
BS.Boolean.Or(a, b)
BS.Boolean.Xor(a, b)
Elementwise operations, ternary
BS.Boolean.If(condition, thenVal, elseVal)
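The comparison operations above produce 0/1 masks that combine naturally with BS.Boolean.If. A hedged one-line sketch (the function name is illustrative):

```
# Elementwise minimum: where x < y take x, otherwise take y.
takeSmaller (x, y) = BS.Boolean.If (Less (x, y), x, y)
```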
Matrix product and convolution operations
Times(A, B, outputRank=1) (operator form: A * B)
TransposeTimes(A, B, outputRank=1)
Convolution(weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)
Pooling(x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')
ROIPooling(x, rois, roiOutputShape, spatialScale=1.0/16.0)
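Times and Plus are all that is needed to write a dense layer by hand; ParameterTensor is described in the next section. The dimensions (512, 784) in this sketch are assumptions:

```
# Hand-written dense layer: z = W * x + b.
W = ParameterTensor {(512:784), init='uniform'}
b = ParameterTensor {512, init='fixedValue', initValue=0}
z = W * x + b    # same as Plus (Times (W, x), b)
```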
Learnable parameters and constants
ParameterTensor{shape, learningRateMultiplier=1.0, init='uniform'/*|'gaussian'*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}
Constant{scalarValue, rows=1, cols=1}
BS.Constants.Zero, BS.Constants.One
BS.Constants.True, BS.Constants.False, BS.Constants.None
BS.Constants.OnesTensor (shape)
BS.Constants.ZeroSequenceLike (x)
Inputs
Input(shape, dynamicAxis='', sparse=false, tag='feature')
DynamicAxis{}
EnvironmentInput (propertyName)
Mean (x), InvStdDev (x)
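Mean and InvStdDev are precomputed over the training data and can be combined with the elementwise operations for input normalization. A hedged sketch, assuming a 40-dimensional feature input:

```
# Declare a dense feature input and mean/variance-normalize it.
feat = Input (40)
featNorm = (feat - Mean (feat)) .* InvStdDev (feat)
```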
Loss functions and metrics
CrossEntropyWithSoftmax(targetDistribution, nonNormalizedLogClassPosteriors)
CrossEntropy(targetDistribution, classPosteriors)
Logistic(label, probability)
WeightedLogistic(label, probability, instanceWeight)
ClassificationError(labels, nonNormalizedLogClassPosteriors)
MatrixL1Reg(matrix)
MatrixL2Reg(matrix)
SquareError (x, y)
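A typical criterion setup pairs CrossEntropyWithSoftmax with ClassificationError and names them in the standard node lists of a BrainScript network description. The 10-class label dimension and the network output z are assumptions:

```
labels = Input (10)
ce     = CrossEntropyWithSoftmax (labels, z)   # training objective
errs   = ClassificationError    (labels, z)    # evaluation metric
criterionNodes  = (ce)
evaluationNodes = (errs)
```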
Reductions
ReduceSum(z, axis=None)
ReduceLogSum(z, axis=None)
ReduceMean(z, axis=None)
ReduceMin(z, axis=None)
ReduceMax(z, axis=None)
CosDistance (x, y)
SumElements (z)
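Each reduction collapses one static axis (or, with the default axis=None, everything). A minimal sketch:

```
# Mean over axis 1, e.g. averaging the rows of a (rows x cols) tensor.
avg = ReduceMean (z, axis=1)
```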
Training operations
BatchNormalization(input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true, imageLayout='CHW')
Dropout(x)
Stabilize (x, enabled=true)
StabilizeElements (x, inputDim=x.dim, enabled=true)
CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)
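Dropout takes no rate at the call site; in CNTK the dropout rate is set in the training (SGD) configuration. A sketch, with h assumed to be some hidden representation:

```
# Mark a node for dropout during training; at evaluation time this is the identity.
hDrop = Dropout (h)
```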
Reshaping operations
CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)
ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)
FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)
SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)
Slice (beginIndex, endIndex, input, axis=1)
BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)
BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)
Splice (inputs, axis=1)
TransposeDimensions (x, axis1, axis2)
Transpose (x) = TransposeDimensions (x, 1, 2)
BS.Sequences.BroadcastSequenceAs (type, data1)
BS.Sequences.Gather (where, x)
BS.Sequences.Scatter (where, y)
BS.Sequences.IsFirst (x)
BS.Sequences.IsLast (x)
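Splice and Slice are inverses of a sort: one concatenates along an axis, the other cuts a range back out. A sketch with assumed inputs x and y and an assumed dimension xDim:

```
# Concatenate two streams along axis 1, then recover the first block.
joined = Splice ((x : y), axis=1)
xAgain = Slice (0, xDim, joined, axis=1)
```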
Recurrence
OptimizedRNNStack(weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')
BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)
PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)
BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)
FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)
LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)
BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)
BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)
BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)
BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)
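PastValue is what makes recurrent definitions possible: a variable may refer to its own previous time step. A minimal sketch of a hand-written recurrent layer, with assumed dimensions 256 (hidden) and 40 (input):

```
# Simple recurrence: h(t) = Sigmoid (W * x(t) + R * h(t-1) + b).
W = ParameterTensor {(256:40)}
R = ParameterTensor {(256:256)}
b = ParameterTensor {256, init='fixedValue', initValue=0}
h = Sigmoid (W * x + R * PastValue (256, h) + b)   # PastValue closes the loop
```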
Sequence-to-sequence support
BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)
BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)
BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)
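The decoder factories wrap an already-trained model. A hedged sketch combining them with BS.Network.Load from the model-editing section; the path is a placeholder:

```
# Turn a trained sequence model into a beam-search decoder of depth 5.
decoder = BS.Seq2Seq.BeamSearchSequenceDecoderFrom (BS.Network.Load ("path/to/model.dnn"), 5)
```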
Special operations
ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)
Model editing
BS.Network.Load (pathName)
BS.Network.Edit (inputModel, editFunctions, additionalRoots)
BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)
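A common use of these is transfer learning: load a trained network and clone a sub-graph with frozen parameters. A sketch; the node names features and h2 are placeholders for nodes of the loaded model:

```
# Reuse a pretrained feature extractor with its weights held constant.
pretrained  = BS.Network.Load ("path/to/model.dnn")
featExtract = BS.Network.CloneFunction (pretrained.features, pretrained.h2, parameters="constant")
h = featExtract (newFeatures)
```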
Other
Fail (what)
IsSameObject (a, b)
Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])
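Trace passes its input through unchanged while logging values, so any node can be wrapped for debugging. A sketch with an assumed node z:

```
# Pass z through while printing its value every 100 evaluations (plus the first 10).
zTraced = Trace (z, say='z', logFrequency=100)
```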
Deprecated
ErrorPrediction(labels, nonNormalizedLogClassPosteriors)
ColumnElementTimes (...) = ElementTimes (...)
DiagTimes (...) = ElementTimes (...)
LearnableParameter(...) = Parameter(...)
LookupTable (embeddingMatrix, inputTensor)
RowRepeat (input, numRepeats)
RowSlice (beginIndex, numRows, input) = Slice(beginIndex, beginIndex + numRows, input, axis=1)
RowStack (inputs)
RowElementTimes (...) = ElementTimes (...)
Scale (...) = ElementTimes (...)
ConstantTensor (scalarVal, shape)
Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:inputDim), ...)
WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)
ScalarParam() = BiasParam (1)
SparseInput (shape, dynamicAxis='', tag='feature')
ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
MeanVarNorm(feat) = PerDimMeanVarNormalization(feat, Mean (feat), InvStdDev (feat))
PerDimMeanVarNormalization (x, mean, invStdDev)
PerDimMeanVarDeNormalization (x, mean, invStdDev)
ReconcileDynamicAxis (dataInput, layoutInput)