Normalization is a family of techniques that transform data before it is passed to certain classes of algorithms, e.g., neural networks and classifiers [James2013]. Normalization can improve both the accuracy and the efficiency of the computation. Different rules can be used to normalize data. Intel DAAL implements two common normalization techniques:

- Z-score normalization, which centers each feature on its mean and scales it by its standard deviation.
- Min-max normalization, which rescales each feature to a fixed range, typically [0, 1].
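To make the effect of normalization concrete, here is a minimal, self-contained Python sketch of z-score and min-max normalization applied to a single feature column. This illustrates the underlying math only; it does not use the Intel DAAL API, and the function names are chosen for illustration.

```python
import math

def z_score(values):
    # Z-score normalization: subtract the mean, divide by the
    # (population) standard deviation, so the result has mean 0.
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def min_max(values):
    # Min-max normalization: linearly rescale values into [0, 1],
    # mapping the minimum to 0 and the maximum to 1.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

data = [2.0, 4.0, 6.0, 8.0]
print(min_max(data))   # endpoints map to 0.0 and 1.0
print(z_score(data))   # resulting values have mean 0
```

In a real workflow these transforms would typically be applied column-wise to a feature table before training, with the mean/std or min/max computed on the training set and reused on new data.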