DeepLearning
BatchNormalizationLayer
create batch normalization layer
Calling Sequence
Parameters
Options
Description
Details
Examples
Compatibility
Calling Sequence
BatchNormalizationLayer(dim, opts)
Parameters
dim - positive integer
opts - one or more options as specified below
Options
center : truefalse
If true, the learned offset beta will be added to the normalized tensor. The default is true.
epsilon : numeric
Specifies a small real number to be added to the variance to avoid division by zero. Default is 0.01.
inputshape : list of integers
Shape of the input Tensor, not including the batch axis.
With the default value auto, the shape is inferred. If inference is not possible, an error is issued.
This option need only be specified when this layer is the first in a Sequential model; see the sketch following these options.
momentum : numeric
Specifies the momentum for the moving average. Default is 0.99.
scale : truefalse
If true, the learned scale factor gamma will be multiplied by the normalized tensor. The default is true.
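As a minimal sketch of passing these options (assuming the standard Maple keyword-equation syntax name = value; the particular values below are illustrative only, not recommendations):

with(DeepLearning):
# Create a standalone layer, overriding the defaults for epsilon and momentum
# and stating the center and scale options explicitly.
layer := BatchNormalizationLayer(2, epsilon = 0.001, momentum = 0.9, center = true, scale = true):
# inputshape = [...] (a list of integers) would additionally be given here if this
# layer were the first layer of a Sequential model and the shape could not be inferred.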
Description
The BatchNormalizationLayer(dim, opts) command creates a batch normalization neural network layer.
This function is part of the DeepLearning package, so it can be used in the short form BatchNormalizationLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[BatchNormalizationLayer](..).
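For instance, both of the following calls construct the same layer; the first relies on with(DeepLearning) having been executed, while the second does not (statements are terminated with a colon to suppress output):

with(DeepLearning):
layer1 := BatchNormalizationLayer(2):                  # short form
layer2 := DeepLearning[BatchNormalizationLayer](2):    # long form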
Details
The implementation of BatchNormalizationLayer uses tf.keras.layers.BatchNormalization from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.BatchNormalization for more information.
Examples
with(DeepLearning)
AddMultiple,ApplyOperation,BatchNormalizationLayer,BidirectionalLayer,BucketizedColumn,CategoricalColumn,Classify,Concatenate,Constant,ConvolutionLayer,DNNClassifier,DNNLinearCombinedClassifier,DNNLinearCombinedRegressor,DNNRegressor,Dataset,DenseLayer,DropoutLayer,EinsteinSummation,EmbeddingLayer,Estimator,FeatureColumn,Fill,FlattenLayer,GRULayer,GatedRecurrentUnitLayer,GetDefaultGraph,GetDefaultSession,GetEagerExecution,GetVariable,GradientTape,IdentityMatrix,LSTMLayer,Layer,LinearClassifier,LinearRegressor,LongShortTermMemoryLayer,MaxPoolingLayer,Model,NumericColumn,OneHot,Ones,Operation,Optimizer,Placeholder,RandomTensor,ResetDefaultGraph,Restore,Save,Sequential,Session,SetEagerExecution,SetRandomSeed,SoftMaxLayer,SoftmaxLayer,Tensor,Variable,Variables,VariablesInitializer,Zeros
model := Sequential([DenseLayer(3), BatchNormalizationLayer(2)])
model := DeepLearning Model<keras.src.engine.sequential.Sequential object at 0x7fef0a2b2950>
model:-Compile()
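The options described earlier can also be supplied when the layer is used inside a model. A sketch building on the example above (the package is already loaded there; the option values are arbitrary illustrations):

# Same model as above, but with a non-default momentum and epsilon for the
# batch normalization layer.
model2 := Sequential([DenseLayer(3), BatchNormalizationLayer(2, momentum = 0.9, epsilon = 0.001)]):
model2:-Compile()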
Compatibility
The DeepLearning[BatchNormalizationLayer] command was introduced in Maple 2022.
For more information on Maple 2022 changes, see Updates in Maple 2022.
See Also
DeepLearning Overview