DeepLearning/Tensor/Softmax
compute softmax of a Tensor
DeepLearning/Tensor/SoftmaxCrossEntropyWithLogits
compute softmax cross entropy between labels and logits
DeepLearning/Tensor/Softplus
compute softplus of a Tensor
Calling Sequence
Parameters
Options
Description
Examples
Compatibility
Softmax(t,opts)
SoftmaxCrossEntropyWithLogits(t,labels=x,logits=y,opts)
Softplus(t,opts)
t - Tensor
opts - zero or more options as specified below
axis=list(integer) or integer
The value of option axis is an integer or list of integers specifying the axis or axes of the input Tensor across which to reduce.
name=string
The value of option name specifies an optional name for this Tensor, to be displayed in output and when visualizing the dataflow graph.
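As a brief, hedged illustration of these options (not taken from the original help page), the following sketch applies Softmax across a chosen axis of a small Tensor and attaches a display name; the input values and the name "row_softmax" are arbitrary assumptions.
with(DeepLearning):
X := Variable(Matrix([[1., 2., 3.], [4., 5., 6.]]), datatype=float[8]):  # illustrative 2 x 3 input
Softmax(X, axis=1, name="row_softmax");  # softmax across axis 1, with an explicit display name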
The Softmax(t,opts) command computes the softmax function of a Tensor t, normalizing exponentials so that each entry becomes exp(t[i]) divided by the sum of exp(t[j]) over the reduction axis.
The SoftmaxCrossEntropyWithLogits(t,labels=x,logits=y) command computes the softmax cross entropy between labels x and logits y.
The Softplus(t,opts) command computes the softplus function log(exp(t)+1) of a Tensor t.
with(DeepLearning):
W := Variable(Matrix([[29., 93., -29.], [-12., -80., 96.], [96., -92., 89.]]), datatype=float[8]);
W := DeepLearning Variable: Name: Variable:0, Shape: [3, 3], Data Type: float[8]
Softmax(W);
DeepLearning Tensor: Shape: [3, 3], Data Type: float[8]
Softplus(W);
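The examples above do not demonstrate SoftmaxCrossEntropyWithLogits; the following hedged sketch continues the session above, assuming the keyword arguments labels and logits from the calling sequence suffice on their own, and using arbitrary one-hot labels and logit values.
labels := Variable(Matrix([[1., 0., 0.], [0., 1., 0.]]), datatype=float[8]):  # assumed one-hot labels
logits := Variable(Matrix([[2., 1., 0.1], [0.5, 2.5, 0.3]]), datatype=float[8]):  # assumed logit values
SoftmaxCrossEntropyWithLogits(labels=labels, logits=logits);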
The DeepLearning/Tensor/Softmax, DeepLearning/Tensor/SoftmaxCrossEntropyWithLogits and DeepLearning/Tensor/Softplus commands were introduced in Maple 2018.
For more information on Maple 2018 changes, see Updates in Maple 2018.
See Also
DeepLearning Overview
DeepLearning,SoftmaxLayer