Intel® MKL with NumPy, SciPy, MATLAB, C#, Python, NAG and More

By Gennady Fedorov, Published: 05/08/2012, Last Updated: 12/26/2017

The following table lists links to useful articles describing how the Intel® Math Kernel Library (Intel® MKL) can be used with third-party libraries and applications.

 

| Topic | Description | Third-Party Application/Tool |
|-------|-------------|------------------------------|
| NumPy/SciPy with Intel® MKL | This article helps current NumPy/SciPy users take advantage of the Intel® Math Kernel Library (Intel® MKL); a sketch for checking that a build is MKL-backed appears after this table. | NumPy/SciPy |
| Using Intel® MKL in R | This article shows how to configure R to use the optimized BLAS and LAPACK in Intel® MKL. | R |
| Using Intel® MKL in Gromacs | This article helps current Gromacs* users get better performance with Intel® MKL and explains how to build 64-bit Gromacs* with Intel MKL for Intel® 64 based applications. | Gromacs* |
| Using Intel® MKL in GNU Octave | This article helps current GNU Octave* users incorporate the latest versions of Intel® MKL on Linux* platforms on Intel® Xeon® processor-based systems. | Octave* |
| Using Intel MKL BLAS and LAPACK with PETSc | This article describes how to build the Portable, Extensible Toolkit for Scientific Computation (PETSc) with Intel® MKL BLAS and LAPACK. | PETSc |
| Using Intel® MKL in your Python program | This article describes how to call Intel® MKL from a Python* program; see the ctypes sketch after this table. | Python* |
| Performance hints for WRF on Intel® architecture | This article explains how to configure the Weather Research & Forecasting (WRF) run-time environment to achieve the best performance and scalability on Intel® architecture with Intel® software tools. | Weather Research & Forecasting (WRF) application |
| HPL application note; Use of Intel® MKL in the High Performance Computing Challenge (HPCC) benchmark | These guides help current HPL (High Performance LINPACK) users get better benchmark performance by utilizing Intel® MKL BLAS. | High Performance Computing Challenge benchmarks |
| Using Intel MKL with MATLAB; Using Intel® MKL in MATLAB Executable (MEX) files | These guides help Intel® MKL customers use the latest version of Intel® MKL for Windows* OS with MathWorks* MATLAB*. | MATLAB* |
| Using Intel® MKL with the IMSL* Fortran numerical library | This article explains how to use the latest version of Intel® MKL with the IMSL* Fortran Numerical Library Version 6.0.0 on Intel® architecture systems under Microsoft Windows*. | IMSL* Fortran |
| Using Intel® MKL with the NAG* libraries | This article describes how to use Intel® MKL with the NAG* libraries. Currently, Intel MKL is used for NAG's BLAS and LAPACK functionality, with the addition of FFTs for NAG's Fortran SMP Libraries. | NAG* libraries |
| Using Intel® MKL in your C# program; How to call MKL from your C# code | These articles describe how to call and link Intel® MKL functions from your C# code, with examples for BLAS, LAPACK, DFTI (the FFT interface), the PARDISO direct sparse solver, and the vector math library (VML). | C# |
| How do I use Intel® MKL with Java*? | The Intel® MKL package contains a set of examples that demonstrate the use of Intel MKL functions with Java* through Java Native Interface (JNI) wrappers. These examples are intended for tutorial use only. | Java* |
| C++ template math libraries | This article provides information about existing high-level C++ APIs available to invoke Intel MKL functionality. | |
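For the NumPy/SciPy entry, a minimal sketch of how to confirm that an installation is actually backed by Intel MKL. `numpy.show_config()` is standard NumPy; the timing loop is only an illustrative sanity check and is not taken from the linked article.

```python
# Minimal sketch: check whether NumPy is built against Intel MKL and run a
# quick BLAS-bound operation. Not taken from the linked article.
import time
import numpy as np

# show_config() lists the BLAS/LAPACK libraries NumPy was built with;
# an MKL-backed build reports library names containing "mkl".
np.show_config()

# Dense matrix multiplication dispatches to the underlying BLAS (dgemm),
# so it exercises MKL on an MKL-backed build.
a = np.random.rand(2000, 2000)
b = np.random.rand(2000, 2000)
t0 = time.time()
c = a @ b
print("2000x2000 matrix multiply: %.3f s" % (time.time() - t0))
```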
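For the "Using Intel® MKL in your Python program" entry, a hedged sketch of calling an MKL CBLAS routine directly through ctypes. The shared-library name `libmkl_rt.so`, its presence on the loader path, and the default LP64 (32-bit integer) interface are assumptions about the local installation; the linked article is the authoritative reference.

```python
# Sketch (assumptions noted above): call MKL's cblas_ddot from Python via
# ctypes, using the single dynamic library (mkl_rt) interface.
import ctypes

# Library name/path is environment-specific; adjust for your installation
# (the Windows* library name differs).
mkl = ctypes.CDLL("libmkl_rt.so")

# double cblas_ddot(const int n, const double *x, const int incx,
#                   const double *y, const int incy);
# c_int assumes the default LP64 interface (32-bit MKL_INT).
mkl.cblas_ddot.restype = ctypes.c_double
mkl.cblas_ddot.argtypes = [
    ctypes.c_int,
    ctypes.POINTER(ctypes.c_double), ctypes.c_int,
    ctypes.POINTER(ctypes.c_double), ctypes.c_int,
]

n = 4
Vec = ctypes.c_double * n
x = Vec(1.0, 2.0, 3.0, 4.0)
y = Vec(1.0, 1.0, 1.0, 1.0)

# MKL computes the dot product: 1*1 + 2*1 + 3*1 + 4*1 = 10.0
print(mkl.cblas_ddot(n, x, 1, y, 1))
```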


 

Product and Performance Information

Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.

Notice revision #20110804