DeepLearning[GradientTape]
Gradient
compute gradient of Tensor or Variable
Jacobian
compute Jacobian of Tensor or Variable
BatchJacobian
compute batch Jacobian of Tensor or Variable
Calling Sequence
Parameters
Options
Description
Details
Examples
Compatibility
gt:-Gradient(y, xs, opts)
gt:-Jacobian(ys, xs, opts)
gt:-BatchJacobian(ys, xs, opts)
gt - a GradientTape object
x, y - a Tensor or Variable object or arbitrarily nested list of these
opts - (optional) one or more options as specified below
unconnected = one of NULL, undefined, or 0
Specifies the flag value to be returned in the event that x and y are not connected when computing the derivative of y with respect to x. The default value is NULL.
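For example, the following sketch (illustrative only: the names h, a, b, and c and the Tensor values are not taken from this page, and elementwise squaring of a Tensor with ^ is assumed, as in the example further below) requests a gradient with respect to a watched Tensor that plays no part in the computation:

with(DeepLearning):
h := GradientTape();
Enter(h);
a := Constant([1., 2.], datatype=float[4]);
b := Constant([3., 4.], datatype=float[4]);
h:-Watch(a);
h:-Watch(b);
c := a^2;                              # c is computed from a only, so it is not connected to b
Exit(h);
# With the default unconnected = NULL, the request below would return NULL;
# with unconnected = 0, the flag value 0 is returned instead.
h:-Gradient(c, b, unconnected = 0);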
gt:-Gradient(y,x) computes the gradient of y with respect to x, where x and y are Tensors or Variables or arbitrarily nested lists of these which are tracked by the GradientTape gt.
gt:-Jacobian(y,x) computes the Jacobian of y with respect to x, where x and y are Tensors or Variables or arbitrarily nested lists of these which are tracked by the GradientTape gt.
gt:-BatchJacobian(y,x) computes a number of Jacobians simultaneously, where x and y are Tensors or Variables or arbitrarily nested lists of these which are tracked by the GradientTape gt. The first dimension of x and y is treated as a batch dimension, and one Jacobian is computed per batch element.
The implementations of Gradient, Jacobian, and BatchJacobian use the similarly named methods from tf.GradientTape in the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.GradientTape for more information.
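The difference between Gradient and Jacobian can also be seen in the shape of the result. In the short sketch below (illustrative only: the names t, v, and s and the Tensor values are not taken from this page, and elementwise squaring of a Tensor with ^ is assumed, as in the example that follows), Gradient would return a Tensor with the shape of v, while Jacobian returns a Tensor whose shape is the shape of s followed by the shape of v:

with(DeepLearning):
t := GradientTape();
Enter(t);
v := Constant([1., 2., 3.], datatype=float[4]);
t:-Watch(v);
s := v^2;
Exit(t);
t:-Jacobian(s, v);                     # expected shape [3, 3], since s and v each have shape [3]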
with(DeepLearning):
g := GradientTape()
g := DeepLearning GradientTape <tensorflow.python.eager.backprop.GradientTape object at 0x7f95351faa10>
Enter(g)
DeepLearning GradientTape <tensorflow.python.eager.backprop.GradientTape object at 0x7f95351faa10>
x := Constant([[1., 2.], [3., 4.]], datatype=float[4])
x := DeepLearning Tensor  Shape: [2, 2]  Data Type: float[4]
g:-Watch(x)
y := x^2
y := DeepLearning Tensor  Shape: [2, 2]  Data Type: float[4]
batch_jacobian := g:-BatchJacobian(y, x)
batch_jacobian := DeepLearning Tensor  Shape: [2, 2, 2]  Data Type: float[4]
Exit(g)
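As a further sketch (not part of the original worksheet; the names g2, y2, and gradient are illustrative), the Tensor x above can also be differentiated with Gradient. A fresh tape is used here because, as with tf.GradientTape in TensorFlow, a non-persistent tape is generally consumed by a single derivative request:

g2 := GradientTape();
Enter(g2);
g2:-Watch(x);
y2 := x^2;
Exit(g2);
gradient := g2:-Gradient(y2, x);       # expected: a Tensor with the shape of x, namely [2, 2]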
The DeepLearning[GradientTape][Gradient], DeepLearning[GradientTape][Jacobian] and DeepLearning[GradientTape][BatchJacobian] commands were introduced in Maple 2022.
For more information on Maple 2022 changes, see Updates in Maple 2022.
See Also
DeepLearning Overview
GradientTape