
Batch Normalization

What is Batch Normalization?

Batch normalization normalizes each activation using the mean and variance of the current mini-batch, then applies a learnable scale γ and shift β. Because γ and β are learned through backpropagation, the network can undo the normalization if that is what minimizes the loss. Batch normalization also has a regularizing effect, since each example is normalized with statistics that depend on the other examples in its mini-batch.

Now, let’s see what happens in batch normalization.

Input : Values of $x$ over a mini-batch: $\mathcal{B} = \{x_{1}, \ldots, x_{m}\}$
Parameters to be learned : $\gamma$, $\beta$
Output : $\{y_{i} = \text{BN}_{\gamma,\beta}(x_{i})\}$

$\mu_{B} \leftarrow \frac{1}{m}\sum_{i=1}^{m}x_{i}$ : Mini-batch mean

$\sigma^{2}_{B} \leftarrow \frac{1}{m}\sum_{i=1}^{m}(x_{i}-\mu_{B})^{2}$ : Mini-batch variance

$\hat{x}_{i} \leftarrow \frac{x_{i}-\mu_{B}}{\sqrt{\sigma^{2}_{B}+\epsilon}}$ : Normalize

$y_{i} \leftarrow \gamma\hat{x}_{i} + \beta \equiv \text{BN}_{\gamma,\beta}(x_{i})$ : Scale and shift
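The four steps above can be sketched directly in NumPy. This is a minimal illustration of the training-time forward pass only (it omits the running statistics used at inference and the backward pass); the function name `batch_norm` and the parameter shapes are my own choices, not from the original.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalization forward pass for a mini-batch x of shape (m, d).

    gamma and beta are the learnable per-feature scale and shift, shape (d,).
    """
    mu = x.mean(axis=0)                     # mini-batch mean, per feature
    var = x.var(axis=0)                     # mini-batch variance, per feature
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to ~zero mean, ~unit variance
    return gamma * x_hat + beta             # scale and shift: y = gamma * x_hat + beta

# Example: a mini-batch of 32 examples with 4 features, far from zero mean.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))

# With gamma = 1 and beta = 0, the output is just the normalized x_hat,
# so each feature column ends up with mean ~0 and standard deviation ~1.
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

Note that with γ = 1 and β = 0 the layer is a pure normalizer; during training, backpropagation adjusts γ and β so the layer can represent any affine transform of the normalized activations.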

This post is licensed under CC BY 4.0 by the author.