Getting Started Guide

Batch Normalization Backward Layer

The forward batch normalization layer normalizes the values x_{i_1...i_p} of the input tensor X ∈ R^{n_1 × n_2 × ... × n_p} for the dimension k ∈ {1, ..., p} and then scales and shifts the result of the normalization. For more details, see Forward Batch Normalization Layer. The backward batch normalization layer [Ioffe2015] computes the values for the dimension k ∈ {1, ..., p}:

z_{i_1...i_p} = ∂E/∂x_{i_1...i_p}
where
  • g is the gradient of the preceding layer: g_{i_1...i_p} = ∂E/∂y_{i_1...i_p}
  • E is the objective function used at the training stage
  • w_k are the weights of the forward layer
  • b_k are the biases of the forward layer
  • m_k is the mean computed during the forward pass
  • v_k is the variance computed during the forward pass
  • σ_k = √(v_k + ε) is the standard deviation, where ε is a small constant added for numerical stability
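Besides the input gradient, a backward batch normalization implementation typically also produces the gradients with respect to the weights and biases; following [Ioffe2015], these are ∂E/∂w_k = Σ g·x̂ and ∂E/∂b_k = Σ g, with the sums taken over every index except the dimension k. A minimal NumPy sketch (function and argument names are illustrative, not the library's API):

```python
import numpy as np

def batchnorm_param_gradients(g, y, w, b, k):
    """Gradients of E w.r.t. the weights and biases of dimension k.

    Per [Ioffe2015]: dE/dw_k = sum(g * x_hat), dE/db_k = sum(g),
    summing over all axes except k. The normalized values x_hat are
    recovered from the forward output y = w * x_hat + b.
    w and b are illustrative 1-D arrays of length n_k.
    """
    axes = tuple(ax for ax in range(g.ndim) if ax != k)
    shape = [1] * g.ndim
    shape[k] = -1                      # broadcast w, b along dimension k
    x_hat = (y - b.reshape(shape)) / w.reshape(shape)
    grad_w = (g * x_hat).sum(axis=axes)
    grad_b = g.sum(axis=axes)
    return grad_w, grad_b
```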

Problem Statement

Given p-dimensional tensors:
  • G ∈ R^{n_1 × n_2 × ... × n_p} - the gradient computed on the preceding layer
  • Y ∈ R^{n_1 × n_2 × ... × n_p} - the output of the forward batch normalization layer
The problem is to compute the p-dimensional tensor Z ∈ R^{n_1 × n_2 × ... × n_p} such that:

z_{i_1...i_p} = (w_k / σ_k) · (g_{i_1...i_p} − (1/M) Σ g − (x̂_{i_1...i_p} / M) Σ g·x̂)

for j = 1, ..., n_k, where:
  • both sums Σ run over the M = n_1 · ... · n_{k-1} · n_{k+1} · ... · n_p elements whose index along the dimension k equals j
  • w_k, b_k, and σ_k are taken at the index j of the dimension k
  • x̂_{i_1...i_p} = (y_{i_1...i_p} − b_k) / w_k is the result of the normalization, recovered from the output of the forward layer
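The computation above can be sketched in NumPy, assuming the standard deviations σ_k saved by the forward pass are available as an array of length n_k (function and argument names here are illustrative, not the library's API):

```python
import numpy as np

def batchnorm_backward(g, y, w, b, sigma, k):
    """Compute Z = dE/dX for batch normalization over dimension k.

    Implements the compact gradient from [Ioffe2015]:
        z = (w / sigma) * (g - mean(g) - x_hat * mean(g * x_hat)),
    with means taken over all elements that share the same index
    along dimension k. The normalized values x_hat are recovered
    from the forward output y = w * x_hat + b.
    w, b, sigma are 1-D arrays of length n_k (illustrative layout).
    """
    axes = tuple(ax for ax in range(g.ndim) if ax != k)
    shape = [1] * g.ndim
    shape[k] = -1                      # broadcast along dimension k
    w = w.reshape(shape)
    b = b.reshape(shape)
    sigma = sigma.reshape(shape)
    x_hat = (y - b) / w
    g_mean = g.mean(axis=axes, keepdims=True)
    gx_mean = (g * x_hat).mean(axis=axes, keepdims=True)
    return (w / sigma) * (g - g_mean - x_hat * gx_mean)
```

A quick way to validate such a sketch is a finite-difference check: perturb each input element, recompute the forward pass, and compare the numerical derivative of the objective with the analytic Z.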

Product and Performance Information

Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.

Notice revision #20110804