Types of Layers Implemented
- Activation layers, which apply a transform to the input data.
- Pooling layers, which apply a form of non-linear downsampling to the input data.
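The two layer kinds above can be illustrated with a minimal sketch. This is not the Intel DAAL API; it is a plain-Python illustration of what an activation transform and a pooling downsample do, with hypothetical function names:

```python
# Conceptual sketch (not the Intel DAAL API): an activation layer applies an
# elementwise transform; a pooling layer applies non-linear downsampling.

def relu_activation(x):
    """Elementwise non-linear transform, as in a ReLU activation layer."""
    return [max(0.0, v) for v in x]

def max_pooling(x, window=2):
    """Non-linear downsampling: keep the maximum of each window."""
    return [max(x[i:i + window]) for i in range(0, len(x), window)]

print(relu_activation([-1.0, 2.0, -3.0, 4.0]))  # [0.0, 2.0, 0.0, 4.0]
print(max_pooling([1.0, 3.0, 2.0, 5.0]))        # [3.0, 5.0]
```

Note that pooling shrinks the data (4 values become 2), while activation preserves its shape.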
- The input of layer i is taken from layer i − 1 for forward layers and from layer i + 1 for backward layers.
- In Intel DAAL, numbering of data samples is scalar.
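The i − 1 / i + 1 indexing can be sketched as follows. This is not the Intel DAAL API; the `Scale` layer and the pass functions are hypothetical, shown only to illustrate that the forward pass feeds each layer from its predecessor while the backward pass feeds it from its successor:

```python
# Conceptual sketch (not the Intel DAAL API) of layer input indexing.

class Scale:
    """Toy layer: multiplies by a constant; its gradient scales the same way."""
    def __init__(self, k):
        self.k = k
    def forward(self, x):
        return x * self.k
    def backward(self, g):
        return g * self.k

def forward_pass(layers, x):
    # Forward layer i takes its input from layer i - 1.
    for layer in layers:
        x = layer.forward(x)
    return x

def backward_pass(layers, grad):
    # Backward layer i takes its input gradient from layer i + 1.
    for layer in reversed(layers):
        grad = layer.backward(grad)
    return grad

net = [Scale(2.0), Scale(3.0)]
print(forward_pass(net, 1.0))   # 6.0
print(backward_pass(net, 1.0))  # 6.0
```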
- For neural network layers, the first dimension of the input tensor represents the data samples. While the actual layout of the data can be different, the access methods of the tensor return the data in this assumed layout. Therefore, for a tensor containing the input to the neural network, it is your responsibility to change the logical indexing of tensor dimensions so that the first dimension represents the data samples. To do this, use the shuffleDimensions() method of the Tensor class.
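The effect of reordering logical dimensions can be sketched in a few lines. This is not the actual Tensor.shuffleDimensions() signature; it is a plain-Python 2-D illustration of moving the sample dimension to the front:

```python
# Conceptual sketch (not the Intel DAAL Tensor API): swap the two logical
# dimensions of a 2-D nested-list "tensor" so samples come first,
# e.g. (features, samples) -> (samples, features).

def shuffle_dimensions_2d(data):
    return [list(row) for row in zip(*data)]

# Data stored as (features, samples): 2 features for 3 samples.
by_feature = [[1, 2, 3],
              [4, 5, 6]]
by_sample = shuffle_dimensions_2d(by_feature)
print(by_sample)  # [[1, 4], [2, 5], [3, 6]]
```

After the shuffle, `by_sample[k]` holds all feature values of sample k, which matches the layout the layers assume.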
- Several neural network layers listed below support in-place computation, which means the result overwrites the input memory, under the following conditions:
- Both the input and the result are represented by a homogeneous tensor of a floating-point type identical to the algorithmFPType type used by the layer for intermediate computations.
- The input of the layer is unique, that is, it is not shared between multiple layers in the neural network topology (the inputs of layers that have a split layer as the preceding layer are shared).
- The conditions above are required for the layers marked with the (*) sign. These layers are used at the prediction stage only, except in the forward computation step of the training stage.

The following layers support in-place computation:
- Absolute Value (Abs) Forward Layer (*)
- Absolute Value (Abs) Backward Layer
- Logistic Forward Layer (*)
- Logistic Backward Layer
- Rectifier Linear Unit (ReLU) Forward Layer (*)
- Rectifier Linear Unit (ReLU) Backward Layer
- Hyperbolic Tangent Forward Layer (*)
- Hyperbolic Tangent Backward Layer
- Split Forward Layer
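What "the result rewrites the input memory" means can be shown with a short sketch. This is not the Intel DAAL API; the function name is hypothetical, and the point is only that the output aliases the input buffer, which is why the input must not be shared with another layer (e.g. after a split layer):

```python
# Conceptual sketch (not the Intel DAAL API): an in-place forward ReLU
# overwrites its input buffer with the result instead of allocating a new
# one. This is only safe when no other layer still needs the input values.

def relu_forward_inplace(buf):
    for i, v in enumerate(buf):
        buf[i] = max(0.0, v)
    return buf  # the result aliases the input memory

x = [-1.0, 0.5, -2.0]
y = relu_forward_inplace(x)
print(y)       # [0.0, 0.5, 0.0]
print(y is x)  # True: the result rewrote the input memory
```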