Get Started with Intel® Distribution for Python*

Use these resources to get up to speed fast.

FAQ

Will Intel Distribution for Python interfere with the system installation of Python?
You can have several versions of Python on your system. Conda* also lets you manage multiple Python environments. By installing Intel Distribution for Python in a conda environment, you ensure that your system installation of Python will not be affected.
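For example, the following creates a self-contained environment for Intel Distribution for Python (a minimal sketch; the environment name idp is just a placeholder, and the intel conda channel is assumed to be reachable):

# Create an isolated conda environment with Intel Distribution for Python,
# leaving the system Python untouched, then activate it
conda create -n idp -c intel intelpython3_core python=3
conda activate idp
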
Why doesn't virtualenv work?
Virtualenv does not copy a required library, libpython, into the virtual environment. For details and workarounds, see this discussion on GitHub*.

We recommend managing environments with conda instead. For more information, see Conda Create.
How do I make Python use the Intel® C Compiler and Intel® C++ Compiler?

To set the environment variables (Bash* on macOS* and Linux*, the command prompt on Windows*), do one of the following:

For macOS* and Linux*:
export CC=icc
export CXX=icpc
export LDSHARED="icc -shared"

For Windows*:
set CC=icl
set LD=xilink
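
Note that the Intel compilers must already be on your PATH (for example, via the environment scripts that ship with them) before you point the build at them. A quick check on macOS and Linux:

# Verify the Intel C compiler is available before exporting CC
which icc && icc --version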

How do I compile my native extension with the Intel C Compiler and Intel C++ Compiler?

To set the environment variables (Bash on macOS and Linux, the command prompt on Windows), do one of the following:

For macOS and Linux:
export CC=icc
export CXX=icpc
export LDSHARED="icc -shared"

For Windows:
set CC=icl
set LD=xilink
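
With these variables set, a standard distutils or setuptools build picks up the Intel compilers on macOS and Linux. A minimal sketch, assuming your extension project provides a setup.py:

# Build the C extension in place; distutils reads CC, CXX, and LDSHARED
# from the environment
export CC=icc
export CXX=icpc
export LDSHARED="icc -shared"
python setup.py build_ext --inplace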

Do you provide a package that lets me call Intel® Math Kernel Library (Intel® MKL) functions directly?

Not directly. Many functions in NumPy and SciPy are implemented with Intel MKL functions. Because Intel Distribution for Python ships the same shared libraries and functions as Intel MKL, you can build your own C extensions that link against those functions. For details, see the Intel MKL documentation.
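
To check whether your NumPy build is backed by Intel MKL, print its build configuration; MKL-based builds typically list libraries such as mkl_rt in the BLAS and LAPACK sections:

# Show which BLAS/LAPACK libraries NumPy was linked against
python -c "import numpy; numpy.show_config()"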

Can I redistribute applications that use the Intel Distribution for Python?

Yes. See INTEL_PYTHON_EULA and redist.txt in the install directory for details. Both documents are downloaded as part of your installation.

What are the -devel and -static packages and how do I use them when compiling my applications?

The -devel and -static packages can be used by developers building applications that require linking to the Intel runtimes included in Intel® Distribution for Python*. For example, a developer may choose to build their own NumPy package with Intel MKL routines.
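
For example, to make the Intel MKL headers and shared libraries available in the active conda environment before such a build (a sketch; it assumes a channel that provides the package, such as the intel channel or the local channel described below, is configured):

# Install MKL shared libraries plus headers into the active environment;
# on Linux and macOS they land under $CONDA_PREFIX/lib and $CONDA_PREFIX/include
conda install mkl-devel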

Is the -devel package included in Intel® Parallel Studio XE different from the one I can get from Anaconda Cloud? If so, how do I know which one to use for my application?

If you plan to compile against the library versions in your Intel Parallel Studio XE installation, use the -devel packages included there.

Starting with Intel Parallel Studio XE 2019, a local conda channel is included at the Intel Parallel Studio XE root (<psxe_installdir>/conda_channel) that contains conda packages of all Intel® Performance Libraries. Installing the libraries from that location (for example, with conda install mkl-devel) links the libraries from Intel Parallel Studio XE into the Python environment directly from the canonical installation, which keeps library versions consistent. This location is already added to the .condarc file in the intelpython environment (<psxe_installdir>/intelpython), so it should work out of the box there.
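
If you are working in an environment other than the bundled intelpython environment, you can point conda at the local channel explicitly. A sketch, using the <psxe_installdir> placeholder from above:

# Install MKL development files from the local Intel Parallel Studio XE channel
conda install -c <psxe_installdir>/conda_channel mkl-devel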

Development versions of Intel MKL, Intel® Integrated Performance Primitives (Intel® IPP), and Intel® Data Analytics Acceleration Library (Intel® DAAL) are included in various forms for developer use. They are located in the <python_root>/pkgs directory and come as conda packages with the following naming scheme (where PROD is the product name):

PROD: shared libraries needed at runtime

PROD-include: header files

PROD-devel: shared libraries plus header files, used for building with dynamic linking

PROD-static: static libraries plus header files, used for building with static linking
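
For Intel MKL, for instance, this scheme corresponds to the packages mkl, mkl-include, mkl-devel, and mkl-static. A quick way to see which variants your installation provides (a sketch, with <python_root> as the placeholder above):

# List the MKL package variants shipped in the installation's package cache
ls <python_root>/pkgs | grep mkl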

Intel® Tech.Decoded

Gain insight into what's ahead with software, from parallel programming and HPC to data science and computer vision. Access online training, webinars, and quick tips to help you learn and master Intel® Software Development Products.