SmoothReLU Backward Layer

The smooth rectifier linear unit (SmoothReLU) activation layer applies the transform f(x) = log(1 + exp(x)) to the input data. The backward SmoothReLU layer computes the values z = y*f'(x), where y is the input gradient computed on the preceding layer and x is the input of the corresponding forward layer.
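
Below is a minimal sketch of the forward transform and its derivative, assuming NumPy arrays as input; the names softplus and softplus_grad are illustrative and not part of any library API.

```python
import numpy as np

def softplus(x):
    # Forward SmoothReLU: f(x) = log(1 + exp(x)),
    # written in a numerically stable form for large |x|.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def softplus_grad(x):
    # Derivative of f: f'(x) = 1 / (1 + exp(-x)), the logistic sigmoid.
    return 1.0 / (1.0 + np.exp(-x))
```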

Problem Statement

Given p-dimensional tensors X = (x_{i_1...i_p}) and Y = (y_{i_1...i_p}) of size n_1 x n_2 x ... x n_p, the problem is to compute the p-dimensional tensor Z = (z_{i_1...i_p}) of size n_1 x n_2 x ... x n_p such that:

z_{i_1...i_p} = y_{i_1...i_p} * f'(x_{i_1...i_p}) = y_{i_1...i_p} / (1 + exp(-x_{i_1...i_p}))
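
As a sketch of this elementwise computation (again assuming NumPy; the function name backward_smoothrelu is illustrative), the backward step multiplies the incoming gradient tensor Y by the sigmoid of the forward-layer input X:

```python
import numpy as np

def backward_smoothrelu(x, y):
    # z_{i1...ip} = y_{i1...ip} / (1 + exp(-x_{i1...ip}))
    # x: input of the forward layer, y: gradient from the preceding backward layer;
    # both have the same shape n_1 x n_2 x ... x n_p.
    return y / (1.0 + np.exp(-x))

# Example with a 2 x 3 tensor (p = 2):
x = np.array([[-1.0, 0.0, 2.0], [3.0, -2.0, 0.5]])
y = np.ones_like(x)
z = backward_smoothrelu(x, y)  # same shape as x and y
```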
