Exponential Linear Unit Backward Layer

The forward exponential linear unit (ELU) layer computes the value of the following function:

f(x) = \begin{cases} x, & x > 0 \\ \alpha (e^x - 1), & x \le 0 \end{cases}

where α is the user-defined real-valued coefficient and x is the input of the layer. The backward ELU layer computes the following value:

z = g \cdot f'(x),

where g is the input gradient tensor obtained from the preceding layer during the back-propagation process, z is the value to be propagated to the next layer, and f'(x) denotes the derivative of the forward function calculated at the point x:

f'(x) = \begin{cases} 1, & x > 0 \\ \alpha e^x, & x \le 0 \end{cases}
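
The relationship between the forward function, its derivative, and the propagated value can be illustrated with a small standalone example. The following C++ sketch implements only the scalar math given above; it does not use the library API, and the function names eluForward and eluDerivative are chosen here purely for illustration.

    #include <cmath>
    #include <cstdio>

    // Forward ELU: f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
    double eluForward(double x, double alpha) {
        return x > 0.0 ? x : alpha * (std::exp(x) - 1.0);
    }

    // Derivative of the forward ELU at x: 1 for x > 0, alpha * exp(x) otherwise.
    double eluDerivative(double x, double alpha) {
        return x > 0.0 ? 1.0 : alpha * std::exp(x);
    }

    int main() {
        const double alpha = 1.0;  // user-defined coefficient
        const double x = -0.5;     // input of the corresponding forward layer
        const double g = 2.0;      // input gradient from the preceding layer

        // Value to be propagated to the next layer: z = g * f'(x).
        const double z = g * eluDerivative(x, alpha);
        std::printf("f(x) = %f, f'(x) = %f, z = %f\n",
                    eluForward(x, alpha), eluDerivative(x, alpha), z);
        return 0;
    }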

Problem Statement

Let G = (g_{i_1 \ldots i_p}) be the input gradient tensor with the components g_{i_1 \ldots i_p} and let α be the user-defined coefficient. The problem is to compute the tensor Z = (z_{i_1 \ldots i_p}) with the components z_{i_1 \ldots i_p} to be propagated to the next layer:

z_{i_1 \ldots i_p} = g_{i_1 \ldots i_p} \cdot f'(x_{i_1 \ldots i_p}) for all i_1, \ldots, i_p,

where x_{i_1 \ldots i_p} are the components of the input tensor X of the corresponding forward layer and f' is the derivative of the forward ELU function.
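
As a sketch of this element-wise computation, the following function applies z_i = g_i * f'(x_i) over flattened arrays standing in for the tensors G, X, and Z. It is illustrative only, assumes both tensors are stored contiguously in the same order, and does not use the library's tensor data structures.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Backward ELU applied element-wise to flattened tensors:
    // z[i] = g[i] * f'(x[i]), with f'(x) = 1 for x > 0 and alpha * exp(x) otherwise.
    std::vector<double> eluBackward(const std::vector<double>& g,
                                    const std::vector<double>& x,
                                    double alpha) {
        std::vector<double> z(g.size());
        for (std::size_t i = 0; i < g.size(); ++i) {
            const double derivative = x[i] > 0.0 ? 1.0 : alpha * std::exp(x[i]);
            z[i] = g[i] * derivative;
        }
        return z;
    }

Because the computation is purely element-wise, the same loop structure applies regardless of the number of tensor dimensions.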
