This section contains reference information on the built-in functions of BrainScript.
The declarations of all built-in functions can be found in CNTK.core.bs, which is located next to the CNTK binary.
The primitive operations and the layers are declared in the global namespace. Additional operations are declared inside namespaces and must be given with their respective prefix (e.g. BS.RNNs.LSTMP).
Layers
DenseLayer{outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}
ConvolutionalLayer{numOutputChannels, filterShape, activation=Identity, init='uniform', initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}
MaxPoolingLayer{filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
AveragePoolingLayer{filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
EmbeddingLayer{outDim, embeddingPath='', transpose=false}
RecurrentLSTMLayer{outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}
DelayLayer{T=1, defaultHiddenActivation=0}
Dropout
BatchNormalizationLayer{spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}
LayerNormalizationLayer{initialScale=1, initialBias=0}
StabilizerLayer{}
FeatureMVNLayer{}
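For example, layers are constructed with {...} and then applied to data like functions. A minimal sketch (the input and all dimensions are illustrative assumptions):

    features = Input (40)                             # assumed 40-dimensional feature vector
    h = DenseLayer {256, activation=ReLU} (features)  # hidden layer with ReLU activation
    z = DenseLayer {10} (h)                           # 10 unnormalized log class posteriors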
Layer building
Activation functions
Elementwise operations, unary
Abs(x)
Ceil(x)
Cosine(x)
Clip(x, minValue, maxValue)
Exp(x)
Floor(x)
Log(x)
Negate(x), -x
BS.Boolean.Not(b), !b
Reciprocal(x)
Round(x)
Sin(x)
Sqrt(x)
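These operations apply independently to every element of a tensor and compose freely. A small sketch (x is assumed to be a previously defined node; the epsilon guard is illustrative):

    eps = Constant {0.000001}
    safeLogMagnitude = Log (Abs (x) + eps)  # elementwise log of |x|, guarded against log(0)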
Elementwise operations, binary
ElementTimes(x, y), x .* y
Minus(x, y), x - y
Plus(x, y), x + y
LogPlus(x, y)
Less(x, y)
Equal(x, y)
Greater(x, y)
GreaterEqual(x, y)
NotEqual(x, y)
LessEqual(x, y)
BS.Boolean.And(a, b)
BS.Boolean.Or(a, b)
BS.Boolean.Xor(a, b)
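The binary operations likewise work element by element; the infix forms listed are shorthand for the function calls. A sketch of an elementwise gate (x and y of matching shape are assumed; Sigmoid is one of the activation functions):

    gated = ElementTimes (x, Sigmoid (y))  # same as x .* Sigmoid (y)
    delta = Minus (x, y)                   # same as x - y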
Elementwise operations, ternary
BS.Boolean.If(condition, thenVal, elseVal)
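If selects between two values element by element based on a 0/1 condition, which makes it easy to express things like an elementwise maximum. A sketch (x and y assumed):

    flag  = Greater (x, y)              # 1 where x > y, 0 elsewhere
    maxXY = BS.Boolean.If (flag, x, y)  # per element: flag ? x : y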
Matrix-product and convolution operations
Times(A, B, outputRank=1), A * B
TransposeTimes(A, B, outputRank=1)
Convolution(weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)
Pooling(x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')
ROIPooling(x, rois, roiOutputShape, spatialScale=1.0/16.0)
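Times is the core of every fully-connected computation: a learnable weight matrix multiplied onto the input, with * as its infix shorthand. A sketch of a bare linear layer (the dimensions and the input x are assumptions):

    W = ParameterTensor {(256:40)}  # 256x40 weight matrix, default uniform init
    b = ParameterTensor {256}       # bias vector
    z = W * x + b                   # i.e. Plus (Times (W, x), b)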
Learnable parameters and constants
ParameterTensor{shape, learningRateMultiplier=1.0, init='uniform'/*|gaussian*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}
Constant{scalarValue, rows=1, cols=1}
BS.Constants.Zero, BS.Constants.One
BS.Constants.True, BS.Constants.False, BS.Constants.None
BS.Constants.OnesTensor (shape)
BS.Constants.ZeroSequenceLike (x)
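ParameterTensor creates values that are updated by training, while Constant and the BS.Constants helpers create fixed ones. A sketch (x assumed):

    halve = Constant {0.5}
    ones  = BS.Constants.OnesTensor (256)  # all-ones tensor of the given shape
    y     = ElementTimes (x, halve)        # scale every element by a fixed 0.5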
Inputs
Input (shape, dynamicAxis='', sparse=false, tag='feature')
DynamicAxis{}
EnvironmentInput (propertyName)
Mean (x), InvStdDev (x)
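Input declares the data that the reader feeds into the network; the tag tells the trainer which node carries features and which carries labels. A typical sketch:

    features = Input (40)               # tag defaults to 'feature'
    labels   = Input (10, tag='label')  # one-hot target distribution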
Loss functions and metrics
CrossEntropyWithSoftmax(targetDistribution, nonNormalizedLogClassPosteriors)
CrossEntropy(targetDistribution, classPosteriors)
Logistic(label, probability)
WeightedLogistic(label, probability, instanceWeight)
ClassificationError(labels, nonNormalizedLogClassPosteriors)
MatrixL1Reg(matrix)
MatrixL2Reg(matrix)
SquareError(x, y)
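A criterion node built from these is what SGD minimizes, while metrics such as ClassificationError are only reported. A sketch continuing the examples above (labels and z assumed):

    ce   = CrossEntropyWithSoftmax (labels, z)  # training criterion; z holds unnormalized log posteriors
    errs = ClassificationError (labels, z)      # evaluation metric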
Reductions
ReduceSum(z, axis=None)
ReduceLogSum(z, axis=None)
ReduceMean(z, axis=None)
ReduceMin(z, axis=None)
ReduceMax(z, axis=None)
CosDistance(x, y)
SumElements(z)
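Reductions collapse an axis into a single value. For example, ReduceLogSum computes the log of the sum of exponentials, which gives a numerically stable log-softmax normalizer (a sketch; z assumed to be a vector along axis 1):

    logZ = ReduceLogSum (z, axis=1)  # log of the softmax denominator
    logP = z - logZ                  # log-softmax, broadcasting the reduced axis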
Training operations
BatchNormalization(input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true, imageLayout='CHW')
Dropout(x)
Stabilize (x, enabled=true)
StabilizeElements (x, inputDim=x.dim, enabled=true)
CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)
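Dropout randomly zeroes elements during training (the dropout rate is set in the training configuration, not at the node) and is the identity at evaluation time; Stabilize multiplies by a learnable self-stabilizer scalar. Sketch (h assumed):

    hDrop = Dropout (h)
    hStab = Stabilize (h, enabled=true)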
Reshaping operations
CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)
ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)
FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)
SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)
Slice (beginIndex, endIndex, input, axis=1)
BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)
BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)
Splice (inputs, axis=1)
TransposeDimensions (x, axis1, axis2)
Transpose (x) = TransposeDimensions (x, 1, 2)
BS.Sequences.BroadcastSequenceAs (type, data1)
BS.Sequences.Gather (where, x)
BS.Sequences.Scatter (where, y)
BS.Sequences.IsFirst (x)
BS.Sequences.IsLast (x)
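Slice cuts out an index range along an axis (axis=-1 addresses the time axis, as in First/Last above), and Splice concatenates. A sketch that swaps the two halves of a 256-dimensional vector (the shape is an assumption):

    firstHalf  = Slice (0, 128, x, axis=1)
    secondHalf = Slice (128, 256, x, axis=1)
    swapped    = Splice ((secondHalf : firstHalf), axis=1)  # concatenate along axis 1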
Recurrence
OptimizedRNNStack (weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')
BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)
PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)
BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)
FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)
LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)
BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)
BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)
BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)
BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)
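BS.Loop.Previous (and its PastValue form) is how recurrences are written: the expression refers to its own value delayed by one time step. A minimal sketch of a plain recurrent layer (dimensions and the input x are assumptions):

    W = ParameterTensor {(256:40)}
    R = ParameterTensor {(256:256)}
    h = Tanh (W * x + R * BS.Loop.Previous (h))  # h(t) depends on h(t-1); starts from the default hidden activation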
Sequence-to-sequence support
BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)
BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)
BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)
Special-purpose operations
ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)
Model editing
BS.Network.Load (pathName)
BS.Network.Edit (inputModel, editFunctions, additionalRoots)
BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)
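These are meant for model-editing scripts, e.g. to load a trained model and reuse part of it with frozen weights. A sketch (the path and the node names are assumptions):

    network = BS.Network.Load ("trained.dnn")
    encoder = BS.Network.CloneFunction (network.features, network.h2, parameters="constant")  # clone with frozen weights
    feat    = encoder (newInput)  # apply the cloned sub-network to a new input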
Other
Fail (what)
IsSameObject (a, b)
Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])
Deprecated
ErrorPrediction (labels, nonNormalizedLogClassPosteriors)
ColumnElementTimes (...) = ElementTimes (...)
DiagTimes (...) = ElementTimes (...)
LearnableParameter (...) = Parameter (...)
LookupTable (embeddingMatrix, inputTensor)
RowRepeat (input, numRepeats)
RowSlice (beginIndex, numRows, input) = Slice (beginIndex, beginIndex + numRows, input, axis=1)
RowStack (inputs)
RowElementTimes (...) = ElementTimes (...)
Scale (...) = ElementTimes (...)
ConstantTensor (scalarVal, shape)
Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:inputDim), ...)
WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)
ScalarParam() = BiasParam (1)
SparseInput (shape, dynamicAxis='', tag='feature')
ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
MeanVarNorm (feat) = PerDimMeanVarNormalization (feat, Mean (feat), InvStdDev (feat))
PerDimMeanVarNormalization (x, mean, invStdDev)
PerDimMeanVarDeNormalization (x, mean, invStdDev)
ReconcileDynamicAxis (dataInput, layoutInput)