ReLU Backward Layer

The rectified linear unit (ReLU) activation layer applies the transform f(x) = max(0, x) to the input data. The backward ReLU layer computes the value z = y * f'(x), where y is the input gradient computed on the preceding layer during backpropagation and f'(x) = {1 if x > 0, 0 if x <= 0}.

Problem Statement

Given p-dimensional tensors X and Y of size n1 x n2 x ... x np, the problem is to compute a p-dimensional tensor Z = (z[i1...ip]) of the same size n1 x n2 x ... x np, where:

z[i1...ip] = {y[i1...ip] if x[i1...ip] > 0, 0 if x[i1...ip] <= 0}
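The element-wise rule above can be sketched in NumPy; this is an illustrative implementation, not the library's own API. The function name relu_backward and the sample tensors are assumptions chosen for the example.

```python
import numpy as np

def relu_backward(x, y):
    # Backward ReLU: pass the incoming gradient y through at positions
    # where the forward input x was positive; zero it elsewhere
    # (including x == 0, matching the definition above).
    return np.where(x > 0, y, 0.0)

x = np.array([[-1.0, 2.0],
              [ 0.0, 3.0]])   # forward-pass input X
y = np.array([[10.0, 20.0],
              [30.0, 40.0]])  # input gradient Y
z = relu_backward(x, y)       # [[0., 20.], [0., 40.]]
```

Because np.where broadcasts element-wise, the same function works unchanged for tensors of any dimensionality p.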
