Batch Normalization is a technique designed to automatically standardize the inputs to a layer in a deep neural network.
Once implemented, batch normalization has the effect of dramatically accelerating the training of a neural network and, in some cases, improving the performance of the model via a modest regularization effect.
BatchNormalization in Keras
Keras provides support for batch normalization via the BatchNormalization layer.
The layer will transform inputs so that they are standardized, meaning that they will have a mean of zero and a standard deviation of one.
During training, the layer standardizes each input variable using the statistics of the current batch, while also keeping track of a moving mean and variance that are used to standardize the data at prediction time.
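As a minimal sketch of how the layer is used, assuming the TensorFlow-bundled Keras API, the following adds a BatchNormalization layer to a small multilayer perceptron; the layer sizes and the 10-feature input shape are arbitrary placeholders rather than values from any particular dataset:

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, BatchNormalization

# A small MLP with batch normalization after the first hidden layer.
# The layer sizes and 10-feature input shape are placeholders.
model = Sequential([
    Input(shape=(10,)),
    Dense(32, activation='relu'),
    BatchNormalization(),  # standardizes the outputs of the previous layer
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```

The layer takes no required arguments here; it learns its scale and shift parameters and accumulates its moving statistics automatically during training.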
BatchNormalization in Models
Batch normalization can be used at most points in a model and with most types of deep learning neural networks.
The BatchNormalization layer is not recommended as an alternative to proper data preparation for a model.
The BatchNormalization layer can be used to standardize inputs either before or after the activation function of the previous layer, as shown in the sketch below.
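A sketch of both placements, again with arbitrary placeholder layer sizes: in the first model, normalization follows the ReLU activation; in the second, the Dense layer has a linear output, normalization is applied first, and the activation follows as a separate layer.

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation

# After the activation function: Dense applies ReLU,
# then its output is standardized.
model_after = Sequential([
    Input(shape=(10,)),
    Dense(32, activation='relu'),
    BatchNormalization(),
    Dense(1, activation='sigmoid'),
])

# Before the activation function: Dense has a linear output,
# normalization happens first, and ReLU is applied to the
# standardized values via a separate Activation layer.
model_before = Sequential([
    Input(shape=(10,)),
    Dense(32),
    BatchNormalization(),
    Activation('relu'),
    Dense(1, activation='sigmoid'),
])
```

Which placement works better is an empirical question that can vary by problem, so it is worth treating the position of the layer as a hyperparameter to evaluate.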