Installing the Intel® AI Analytics Toolkit with the Conda* Package Manager

This page provides general installation and support notes about the Intel® oneAPI Toolkits as they are distributed for use with the Conda package manager.

LEGAL NOTICE: By downloading and using these packages and the included software, you agree to the terms and conditions of the software license agreements located at the End User License Agreements page.
By proceeding you acknowledge that you have read the EULA and agree to the terms and conditions of this agreement.

NOTE: If you have an existing installation of an older version of Intel® oneAPI, be sure that your working environment is not activated to use the older installation.

NOTE: The Conda repository is a public repository on the general Internet. If you are on a company intranet behind a firewall, you may need to set the environment variables https_proxy and http_proxy to your company's proxy server and port. Contact your local network or system administrators for assistance if you are unfamiliar with using proxy servers.
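For example, on Linux you might set the proxy variables like this (the host and port shown are placeholders; substitute the values provided by your network administrator):

```shell
# Point conda's network access at the corporate proxy
# (proxy.example.com:8080 is a hypothetical server and port)
export http_proxy=http://proxy.example.com:8080
export https_proxy=http://proxy.example.com:8080
```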

Installing the Intel® AI Analytics Toolkit (AI Kit) with Anaconda*

Intel provides access to the AI Kit through a public Anaconda repository. See below for instructions on how to pull the latest versions of the Intel tools. For more information, visit the Conda User Guide.
Installation using Conda requires an existing Conda-based Python environment. You can obtain such an environment by installing the Intel® Distribution for Python* or Miniconda*.
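As a sketch, a typical Miniconda* installation on Linux looks like the following (the installer file name and install prefix are the conventional ones; verify the current installer name on the Miniconda download page before running):

```shell
# Download the Linux x86_64 Miniconda installer from Anaconda's repository
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
# Run it in batch mode, installing under $HOME/miniconda3
bash Miniconda3-latest-Linux-x86_64.sh -b -p "$HOME/miniconda3"
# Make conda available in the current shell session
source "$HOME/miniconda3/bin/activate"
```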

To get more details on the AI Analytics Toolkit, visit the Intel AI Analytics toolkit home page.


The AI Kit contains three distinct Python environments targeting different use cases:

  • intel-aikit-tensorflow
    for deep learning workflows using Intel® Optimization for TensorFlow*
  • intel-aikit-pytorch
    for deep learning workflows using Intel® Optimization for PyTorch*
  • intel-aikit-modin
    for data analytics and machine learning workflows using Intel® Distribution of Modin (for accelerated data frames) and Intel-optimized scikit-learn and XGBoost (for ML training and inference).
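If you want to see which versions of one of these metapackages are currently published before installing, you can query the channel directly (this assumes conda is already on your PATH):

```shell
# List the published versions of an AI Kit metapackage on the intel channel
conda search -c intel intel-aikit-tensorflow
```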

In the steps outlined below, substitute the package name of your choice from the list of three above for the package name used in the example.

1. Activate your existing Python conda environment, located in <pythonhome>:

source <pythonhome>/bin/activate

2. Install the AI Kit oneAPI packages in a new environment using “conda create”. A list of available packages is located at https://anaconda.org/intel/repo. For example, you would use the following to create an AI Kit TensorFlow* environment named “aikit-tf”:

conda create -n aikit-tf -c intel intel-aikit-tensorflow

Similarly, you can create an AI Kit PyTorch environment named "aikit-pt":

conda create -n aikit-pt -c intel intel-aikit-pytorch

3. If you want multiple package sets in one environment, add both to your command. For example, to install both TensorFlow* and daal4py into one Python environment, add intel-daal4py to the previous command:

conda create -n aikit-tf -c intel intel-aikit-tensorflow intel-daal4py

To install the Intel® Distribution of Modin, alter the command to include the conda-forge channel for additional dependencies:

conda create -n aikit-modin intel-aikit-modin -c intel -c conda-forge

4. Set up the user environment. After the toolkit is installed, and before accessing the tools, you must “activate” your Python environment to set up the environment variables needed to access the tools. For example, to activate the Python environment created in the previous step, you would use:

conda activate aikit-tf
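Once the environment is active, you can sanity-check that the optimized framework is importable (this assumes the intel-aikit-tensorflow environment created in step 2; for the PyTorch environment you would import torch instead):

```shell
# Confirm the environment's TensorFlow installation is usable
python -c "import tensorflow as tf; print(tf.__version__)"
```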

NOTE: To install the Model Zoo for Intel® Architecture component of the toolkit, clone the master branch to your local directory:
git clone https://github.com/IntelAI/models.git

NOTE: If you have applications with long-running GPU compute workloads in native environments, you must disable the hangcheck timeout period to avoid terminating workloads.

NOTE: Intel® packages are available under the “intel” label on Anaconda* Cloud. You must include “-c intel” on your command line, as in the examples above, or add “intel” to your Conda configuration file using “conda config --add channels intel”.
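As a sketch, the two equivalent ways of making the channel available look like this (the conda config command writes the channel into your ~/.condarc, so later commands can omit -c intel):

```shell
# Option 1: pass the channel explicitly on each command
conda create -n aikit-tf -c intel intel-aikit-tensorflow

# Option 2: add the channel to your conda configuration once
conda config --add channels intel
conda config --show channels   # verify the channel list
```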

Product and Performance Information


Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.