DeepLearning
Optimizer
create an Optimizer object
Calling Sequence
Parameters
Description
Supported Optimizers
Details
Examples
Compatibility
Optimizer(X)
X - function; optimizer name with parameters
The Optimizer(X) command creates an Optimizer using the specified function X with parameters.
An Optimizer computes gradients for a loss function and applies them to variables. The implemented optimizers include classic optimization algorithms such as gradient descent and Adagrad.
In the Usage column of the table below, all arguments are real-valued input parameters. The learning_rate argument is mandatory for every optimizer; the remaining parameters may be omitted and then revert to their default values, as illustrated after the table.
Optimizer | Usage | Description
Gradient Descent | GradientDescent(learning_rate) | Gradient descent algorithm
Adadelta | Adadelta(learning_rate, rho, epsilon) | Adadelta algorithm
Adagrad | Adagrad(learning_rate, initial_accumulator_value) | Adagrad algorithm
Adam | Adam(learning_rate, beta1, beta2) | Adam algorithm
Ftrl | Ftrl(learning_rate, learning_rate_power, initial_accumulator_value, l1_regularization_strength, l2_regularization_strength) | FTRL algorithm
ProximalGradientDescent | ProximalGradientDescent(learning_rate, l1_regularization_strength, l2_regularization_strength) | Proximal gradient descent algorithm
RMSProp | RMSProp(learning_rate, decay, momentum, epsilon) | RMSProp algorithm
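For example, following the Usage column above, an optimizer can be constructed either with its full parameter list or with only the mandatory learning rate; omitted parameters fall back to the defaults of the underlying TensorFlow implementation. The numeric values here are illustrative only:

> with(DeepLearning):
> # Full parameter list: learning rate, decay, momentum, epsilon
> opt1 := Optimizer(RMSProp(0.001, 0.9, 0.0, 1e-7)):
> # Mandatory learning rate only; remaining parameters use their defaults
> opt2 := Optimizer(RMSProp(0.001)):
> # Adam with explicit decay rates beta1 and beta2 for the moment estimates
> opt3 := Optimizer(Adam(0.001, 0.9, 0.999)):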
This function is part of the DeepLearning package, so it can be used in the short form Optimizer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[Optimizer](..).
For more information on any of the above optimizers and the meaning of the input parameters, consult the TensorFlow Python API documentation for tf.train.
> with(DeepLearning):
> f := Optimizer(Adagrad(0.01));
f := DeepLearning Optimizer<keras.src.optimizers.adagrad.Adagrad object at 0x7f9f6b5a9c10>
> g := Optimizer(Adam(0.05));
g := DeepLearning Optimizer<keras.src.optimizers.adam.Adam object at 0x7f9f2ad7c990>
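Beyond construction, the description above notes that an Optimizer computes gradients for a loss function and applies them to variables. The following is a minimal sketch of that workflow, not taken from this help page: it assumes DeepLearning[Variable] is available and that the Optimizer object exposes a Minimize method mirroring the minimize method of the wrapped TensorFlow optimizer. Verify both against the DeepLearning documentation for your Maple version.

> with(DeepLearning):
> v := Variable([2.0], datatype = float[8]):   # assumed constructor for a trainable variable
> opt := Optimizer(GradientDescent(0.1)):
> # Assumed method: one gradient-descent step reducing the quadratic loss v[1]^2
> opt:-Minimize(v[1]^2):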
The DeepLearning[Optimizer] command was introduced in Maple 2018.
For more information on Maple 2018 changes, see Updates in Maple 2018.
See Also
DeepLearning Overview