

DeepLearning

  

Optimizer

  

create an Optimizer object

 

Calling Sequence

Parameters

Description

Supported Optimizers

Details

Examples

Compatibility

Calling Sequence

Optimizer(X)

Parameters

X - function; optimizer name with parameters

Description

• The Optimizer(X) command creates an Optimizer using the specified function X with parameters.

• An Optimizer computes gradients for a loss function and applies those gradients to variables. The implemented optimizers include classic optimization algorithms such as gradient descent and Adagrad.
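
For example, a minimal sketch of constructing a gradient-descent Optimizer; the learning rate value 0.5 is purely illustrative:

with(DeepLearning):
# Create an Optimizer that applies plain gradient descent with the given learning rate
opt := Optimizer(GradientDescent(0.5));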

Supported Optimizers

• In the Usage column of the table below, all arguments are real-valued input parameters. The learning_rate argument is mandatory for every optimizer; the remaining parameters may be omitted, in which case they revert to their default values (see the sketch following the table).

Optimizer | Usage | Description
Gradient Descent | GradientDescent(learning_rate) | Gradient descent algorithm
Adadelta | Adadelta(learning_rate,rho,epsilon) | Adadelta algorithm
Adagrad | Adagrad(learning_rate,initial_accumulator_value) | Adagrad algorithm
Adam | Adam(learning_rate,beta1,beta2) | Adam algorithm
Ftrl | Ftrl(learning_rate,learning_rate_power,initial_accumulator_value,l1_regularization_strength,l2_regularization_strength) | FTRL algorithm
ProximalGradientDescent | ProximalGradientDescent(learning_rate,l1_regularization_strength,l2_regularization_strength) | Proximal gradient descent algorithm
RMSProp | RMSProp(learning_rate,decay,momentum,epsilon) | RMSProp algorithm
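
As a sketch of how the optional parameters can be supplied or omitted, consider the Adam entry above; the numeric values are illustrative, not recommended settings:

with(DeepLearning):
# All parameters supplied: learning_rate, beta1, beta2
adamFull := Optimizer(Adam(0.001, 0.9, 0.999)):
# Only the mandatory learning_rate; beta1 and beta2 revert to their defaults
adamDefault := Optimizer(Adam(0.001)):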

• This function is part of the DeepLearning package, so it can be used in the short form Optimizer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[Optimizer](..).
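
For instance, a brief sketch of the two calling forms; the learning rate 0.5 is illustrative, and it is assumed here that the inner optimizer name is accepted unevaluated when the package has not been loaded:

# Long form, without loading the package first
DeepLearning[Optimizer](GradientDescent(0.5));
# Short form, after loading the package
with(DeepLearning):
Optimizer(GradientDescent(0.5));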

Details

• For more information on any of the above optimizers and the meaning of the input parameters, consult the TensorFlow Python API documentation for tf.train.

Examples

with(DeepLearning):

f := Optimizer(Adagrad(0.01))

f := DeepLearning Optimizer<keras.src.optimizers.adagrad.Adagrad object at 0x7f9f6b5a9c10>

(1)

g := Optimizer(Adam(0.05))

g := DeepLearning Optimizer<keras.src.optimizers.adam.Adam object at 0x7f9f2ad7c990>

(2)

Compatibility

• The DeepLearning[Optimizer] command was introduced in Maple 2018.

• For more information on Maple 2018 changes, see Updates in Maple 2018.

See Also

DeepLearning Overview