Installing the Intel® AI Analytics Toolkit with the Conda* Package Manager

This page provides general installation and support notes about the Intel® oneAPI Toolkits as they are distributed for use with the Conda package manager.

LEGAL NOTICE: By downloading and using these packages and the included software, you agree to the terms and conditions of the software license agreements located at the End User License Agreements page.
By proceeding you acknowledge that you have read the EULA and agree to the terms and conditions of this agreement.

NOTE: If you have an existing installation of an older version of Intel® oneAPI, make sure your working environment is not set up to use that older installation.

NOTE: The Conda repository is a public repository on the general Internet. If you are on a company intranet behind a firewall, you may need to set the https_proxy and http_proxy environment variables to point to your company's proxy server and port. Contact your local network or system administrator for assistance if you are unfamiliar with using proxy servers.
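For example, on a Linux system behind a proxy you might export these variables before running any conda commands. The host name and port below are placeholders; substitute the values provided by your network administrator:

export http_proxy=http://proxy.example.com:8080
export https_proxy=http://proxy.example.com:8080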

Installing the Intel® AI Analytics Toolkit (AI Kit) with Anaconda*

Intel provides access to the AI Kit through a public Anaconda repository. See below for instructions on how to pull the latest versions of the Intel tools. For more information, visit the Conda User Guide.
Installation using Conda requires an existing Conda-based Python environment. You can get such an environment by installing the Intel® Distribution for Python* or Miniconda*.
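A quick way to confirm that a suitable conda is already available on your PATH, once one of the distributions above is installed, is to ask for its version:

conda --version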

To get more details on the AI Analytics Toolkit, visit the Intel® AI Analytics Toolkit home page.


The AI Kit contains three distinct Python environments targeting different use cases (a command for listing the available packages is sketched after this list):

  • intel-aikit-tensorflow
    for deep learning workflows using TensorFlow*
  • intel-aikit-pytorch
    for deep learning workflows using PyTorch*
  • intel-aikit-modin
    for data analytics and machine learning workflows using the Intel® Distribution of Modin (for accelerated data frames) and Intel-optimized scikit-learn and XGBoost (for ML training and inference).
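If you want to see which of these packages (and which versions) are currently published on the channel, a conda search along these lines should work; the exact output depends on what is published at the time:

conda search -c intel/label/oneapibeta "intel-aikit-*"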

In the steps outlined below, substitute the package name of your choice from the list above for the package name used in the example.

1. Activate your existing Python conda environment located in <pythonhome>:

source <pythonhome>/bin/activate
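For example, assuming Miniconda* was installed to the default location in your home directory (adjust the path to match your actual installation), the command might look like this:

source ~/miniconda3/bin/activate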

2. Install the AI Kit oneAPI beta packages in a new environment using “conda create”. A list of available packages is located at https://anaconda.org/intel/repo. For example, to create an AI Kit TensorFlow* environment named “aikit-tf”, you would use:

conda create -n aikit-tf -c intel/label/oneapibeta intel-aikit-tensorflow
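To confirm that the environment was created, you can list your conda environments; the new “aikit-tf” entry should appear in the output:

conda env list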

3. If you want multiple package sets in one environment, add them all to the same command. For example, to install both TensorFlow* and the Intel® Distribution of Modin into one Python environment, add intel-aikit-modin to the previous command:

conda create -n aikit-tf -c intel/label/oneapibeta intel-aikit-tensorflow intel-aikit-modin

4. Set up the user environment. After the toolkit is installed and before accessing the tools, you must activate your Python environment, which sets up the environment variables needed to access the tools. For example, to activate the Python environment created in the previous step, you would use:

conda activate aikit-tf
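As a quick sanity check after activation, you can try importing TensorFlow from the new environment; this assumes the intel-aikit-tensorflow package installs TensorFlow under its usual module name:

python -c "import tensorflow as tf; print(tf.__version__)"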

NOTE: To install the Model Zoo for Intel® Architecture component of the toolkit, clone the master branch to your local directory:
git clone https://github.com/IntelAI/models.git
After you have cloned the master branch, check out version 1.6.0, the version compatible with the Beta08 release of the Intel® AI Analytics Toolkit:
cd models
git checkout v1.6.0
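To confirm that the working tree is now at the expected tag, you can ask Git to describe the current checkout; it should report v1.6.0:

git describe --tags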

NOTE: If you have applications with long-running GPU compute workloads in native environments, you must disable the hangcheck timeout period to avoid terminating workloads.
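On Linux systems using the i915 graphics driver, one commonly documented way to disable hangcheck until the next reboot is sketched below; whether this applies to your system depends on your kernel, driver, and distribution, so consult your GPU driver documentation before changing it:

sudo sh -c "echo N > /sys/module/i915/parameters/enable_hangcheck"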

NOTE: Intel® oneAPI beta packages are available on our “oneapibeta” label on the Anaconda Cloud. You must include “-c intel/label/oneapibeta” on your command line or add “intel/label/oneapibeta” to your Conda configuration file using “conda config --add channels intel/label/oneapibeta”.
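For reference, the two alternatives look like this:

# one-off: specify the channel on the command line
conda create -n aikit-tf -c intel/label/oneapibeta intel-aikit-tensorflow
# persistent: add the channel to your Conda configuration
conda config --add channels intel/label/oneapibeta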

Product and Performance Information

Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.

Notice revision #20110804