Exponential Linear Unit Backward Layer

The forward exponential linear unit (ELU) layer computes the value of the following function:

f(x) = \begin{cases} x, & x > 0 \\ \alpha (e^{x} - 1), & x \leq 0 \end{cases}

where \alpha is a user-defined real-valued coefficient and x is the input of the layer. The backward ELU layer computes the value

z = g \cdot f'(x),

where g is the input gradient tensor obtained from the preceding layer during the back-propagation process, z is the value to be propagated to the next layer, and

f'(x) = \begin{cases} 1, & x > 0 \\ \alpha e^{x}, & x \leq 0 \end{cases}

denotes the derivative of the forward function calculated at the point x.
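The forward function and its derivative can be sketched in NumPy as follows. This is a minimal illustration of the formulas above, not the library's actual API; the function names are chosen for this example.

```python
import numpy as np

def elu_forward(x, alpha=1.0):
    # f(x) = x for x > 0, alpha * (e^x - 1) for x <= 0
    # np.expm1(x) computes e^x - 1 accurately for small x
    return np.where(x > 0, x, alpha * np.expm1(x))

def elu_derivative(x, alpha=1.0):
    # f'(x) = 1 for x > 0, alpha * e^x for x <= 0
    return np.where(x > 0, 1.0, alpha * np.exp(x))
```

Note that the derivative is continuous at x = 0 when \alpha = 1, since \alpha e^{0} = 1.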

Problem Statement

Let G = (g_{i_1 \ldots i_p}) \in \mathbb{R}^{n_1 \times \ldots \times n_p} be the input gradient tensor with the components g_{i_1 \ldots i_p}, and let \alpha be the user-defined coefficient. The problem is to compute the tensor Z = (z_{i_1 \ldots i_p}) \in \mathbb{R}^{n_1 \times \ldots \times n_p} with the components

z_{i_1 \ldots i_p} = g_{i_1 \ldots i_p} \cdot f'(x_{i_1 \ldots i_p})

to be propagated to the next layer, where x_{i_1 \ldots i_p} are the components of the input tensor of the forward ELU layer and

f'(x) = \begin{cases} 1, & x > 0 \\ \alpha e^{x}, & x \leq 0 \end{cases}

for all i_k = 1, \ldots, n_k, k = 1, \ldots, p.
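The element-wise computation above can be sketched as a single vectorized operation over a tensor of any shape. This is an illustrative sketch under the formulas in this section; the function name and signature are assumptions, not the library's API.

```python
import numpy as np

def elu_backward(g, x, alpha=1.0):
    """Compute z = g * f'(x) element-wise for tensors of any shape.

    g     -- input gradient tensor from the preceding layer
    x     -- input tensor of the forward ELU layer
    alpha -- user-defined coefficient
    """
    # f'(x) = 1 for x > 0, alpha * e^x for x <= 0
    return g * np.where(x > 0, 1.0, alpha * np.exp(x))

# Usage: gradients pass through unchanged where x > 0,
# and are scaled by alpha * e^x where x <= 0.
x = np.array([[1.5, -1.0], [0.0, 2.0]])
g = np.ones_like(x)
z = elu_backward(g, x)
```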
