DeepLearning[GradientTape] - create gradient tape object
Calling Sequence
Parameters
Options
Description
Operations with GradientTapes
Details
Examples
Compatibility
GradientTape( opts )
opts - (optional) options as described below
persistent = truefalse
Specifies whether a persistent gradient tape is created.
When true, the recorded data is retained until Reset is called or the GradientTape object is garbage-collected, so more than one gradient can be computed from the same recording.
When false (the default), the recorded data is released as soon as Gradient or Jacobian is called, so the tape can be used for only one gradient computation.
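For example, a persistent tape can be queried for more than one gradient from a single recording. The following is a minimal sketch, assuming that elementwise ^ applies to Tensors (as in the example below) and using only the operations documented on this page:
with(DeepLearning):
x := Constant([3.0, 5.0]);
tape := GradientTape(persistent = true);
Enter(tape);
tape:-Watch(x);               # track x on the tape
y := x^2;                     # first quantity recorded on the tape
z := x^3;                     # second quantity recorded on the same tape
Exit(tape);
g1 := tape:-Gradient(y, x);   # a second call is allowed because persistent = true
g2 := tape:-Gradient(z, x);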
GradientTape builds a gradient tape object to track one or more Tensors or Variables for the purposes of computing gradients.
This function is part of the DeepLearning package, so it can be used in the short form GradientTape(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[GradientTape](..).
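For instance, both forms construct a GradientTape:
DeepLearning[GradientTape]();     # long form; no call to with required
with(DeepLearning):
GradientTape();                   # short form, after loading the package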
Operations that can be performed with a GradientTape include Enter, Exit, Watch, Gradient, Jacobian, and Reset; several of these are illustrated in the example below.
The implementation of GradientTape uses the tf.GradientTape command from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.GradientTape for more information.
Compute a simple gradient of a constant Tensor.
with(DeepLearning):
x := Constant([3.0, 5.0]);
      x := DeepLearning Tensor  Shape: [2]  Data Type: float[4]
tape := GradientTape();
      tape := DeepLearning GradientTape<tensorflow.python.eager.backprop.GradientTape object at 0x7f6e6e72a050>
Enter(tape);
      DeepLearning GradientTape<tensorflow.python.eager.backprop.GradientTape object at 0x7f6e6e72a050>
tape:-Watch(x);
y := x^2;
      y := DeepLearning Tensor  Shape: [2]  Data Type: float[4]
Exit(tape);
grad := tape:-Gradient(y, x);
      grad := DeepLearning Tensor  Shape: [2]  Data Type: float[4]
grad;
      DeepLearning Tensor  Shape: [2]  Data Type: float[4]
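Since y = x^2, the gradient of y with respect to x is 2*x elementwise, so for x = [3.0, 5.0] the entries of grad should be [6.0, 10.0].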
The DeepLearning[GradientTape] command was introduced in Maple 2022.
For more information on Maple 2022 changes, see Updates in Maple 2022.
See Also
DeepLearning Overview