Install Intel® Distribution of OpenVINO™ toolkit for Windows* 10

This guide applies to Microsoft Windows* 10 64-bit. For Linux* OS information and instructions, see the Installation Guide for Linux.

Introduction

Important:
- All steps in this guide are required, unless otherwise stated.
- In addition to the download package, you must install dependencies and complete configuration steps.

Your installation is complete when you have finished all of these steps:

  1. Install the Intel® Distribution of OpenVINO™ toolkit core components
  2. Install the dependencies
  3. Set environment variables
  4. Configure the Model Optimizer
  5. Run two demos
  6. Complete the optional steps that apply to your hardware

About the Intel® Distribution of OpenVINO™ toolkit

The Intel® Distribution of OpenVINO™ toolkit speeds the deployment of applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNN), the toolkit extends computer vision (CV) workloads across Intel® hardware to maximize performance.

The Intel® Distribution of OpenVINO™ toolkit includes the Intel® Deep Learning Deployment Toolkit. For more information, see the Intel® Distribution of OpenVINO™ toolkit Overview page.

The Intel® Distribution of OpenVINO™ toolkit for Windows* 10 OS:

  • Enables CNN-based deep learning inference on the edge
  • Supports heterogeneous execution across Intel® CPU, Intel® Processor Graphics (GPU), Intel® Movidius™ Neural Compute Stick, Intel® Neural Compute Stick 2, and Intel® Vision Accelerator Design with Intel® Movidius™ VPUs
  • Speeds time-to-market through an easy-to-use library of computer vision functions and pre-optimized kernels
  • Includes optimized calls for computer vision standards including OpenCV*, OpenCL™, and OpenVX*

Included in the Installation Package

The following components are installed by default:

  • Model Optimizer: This tool imports, converts, and optimizes models that were trained in popular frameworks to a format usable by Intel tools, especially the Inference Engine.

    NOTE: Popular frameworks include Caffe*, TensorFlow*, MXNet*, and ONNX*.

  • Inference Engine: The engine that runs the deep learning model. It includes a set of libraries for easy inference integration into your applications.
  • OpenCV: The OpenCV* community version compiled for Intel® hardware. Includes PVL libraries for computer vision.
  • OpenVX*: Intel's implementation of OpenVX* optimized for running on an Intel CPU, GPU, or IPU (image processing unit).
  • Pre-trained models: A set of Intel pre-trained models for learning and demo purposes, or for developing deep learning software.
  • Sample Applications: A set of simple console applications demonstrating how to use Intel's Deep Learning Inference Engine in your applications. For information about building and running the samples, refer to the Inference Engine Developer Guide.

System Requirements

Only the Intel® CPU, Intel® Processor Graphics, Intel® Movidius™ Neural Compute Stick, Intel® Neural Compute Stick 2, and Intel® Vision Accelerator Design with Intel® Movidius™ VPUs options are supported for the Windows* installation. Linux* is required to use the FPGA.

Hardware

  • 6th to 8th Generation Intel® Core™ processors
  • Intel® Xeon® v5 family
  • Intel® Xeon® v6 family
  • Intel® Movidius™ Neural Compute Stick
  • Intel® Neural Compute Stick 2
  • Intel® Vision Accelerator Design with Intel® Movidius™ VPUs

Processor Notes:

  • Processor graphics are not included in all processors. See Processor specifications for information about your processor.

  • A chipset that supports processor graphics is required if you're using an Intel Xeon processor. See Chipset specifications for information about your chipset.

Operating System

Microsoft Windows* 10 64-bit


Installation Steps

Install the Intel® Distribution of OpenVINO™ toolkit Core Components

  1. If you have not downloaded the Intel® Distribution of OpenVINO™ toolkit, download the latest version. By default, the file is saved to the Downloads directory as w_openvino_toolkit_p_<version>.exe.
  2. Go to the Downloads folder.
  3. Double-click w_openvino_toolkit_p_<version>.exe. A window opens to let you choose your installation directory and components. The default installation directory is C:\Intel. If you choose a different installation directory, the installer will create the directory for you.

    OpenVINO for Windows Installation screen
     
  4. Click Next.
  5. You are asked whether you want to provide consent to gather information. Choose the option you prefer and click Next.
  6. If you are missing external dependencies, you will see a warning screen. Write down the dependencies you are missing. You do not need to take any other action at this time. After you install the Intel® Distribution of OpenVINO™ toolkit core components, you will be given instructions for installing the missing dependencies.
    The screen example below indicates you are missing two dependencies:

    OpenVINO Installation Prerequisites screen
     
  7. Click Next.
  8. When the first part of the installation is complete, the final screen informs you that the core components have been installed and that additional steps are still required:

    OpenVINO for Windows Installation Complete screen
     
  9. Click Finish to close the installation wizard. A new browser window opens to the next section of this installation guide, which covers installing the dependencies. The new window opens in case you ran the installer without first opening this guide. If the installer did not indicate that you are missing dependencies, you can skip ahead to Configure the Model Optimizer.

Install the External Software Dependencies

If the installation process indicated that you are missing dependencies, you must install each missing dependency. Click the link for the first dependency you must install:

If you have no missing dependencies, skip ahead to Configure the Model Optimizer.

Microsoft Visual Studio* with C++ and MSBuild

Microsoft Visual Studio with Visual Studio C++ is required for building the Intel® Deep Learning Deployment Toolkit samples and demonstration applications. You can install the free Community version of Microsoft Visual Studio.

IMPORTANT: The Microsoft Visual Studio dependency is a two-part installation that consists of Microsoft Visual Studio 2017 or 2015 and the Microsoft Visual Studio Build Tools. This guide includes steps for installing both parts of this dependency. These are separate installations. MAKE SURE YOU INSTALL BOTH COMPONENTS.

The steps below apply to Microsoft Visual Studio 2017. If you prefer to use Microsoft Visual Studio 2015, see Installing Microsoft Visual Studio* 2015 for Intel® Distribution of OpenVINO™ toolkit.

  1. Go to the Visual Studio downloads page.
  2. Click Free Download in the Visual Studio 2017 box, Community section:

    Select Visual Studio 2017 Community to download
     
    An executable file named vs_community__313888930.1524151023.exe, or similar, is saved in your Downloads folder.
  3. Double-click the executable file to launch the Visual Studio Community 2017 installer.
  4. From the Workloads tab, use the check boxes to select Universal Windows Platform development and Desktop development with C++:

    Select Visual Studio 2017 Tools to Install
     
  5. Under the Individual components tab, select MSBuild:

    Select Individual Components to install

    The Summary at the right side of the screen displays your installation selections:

    Summary of selected components
     
  6. Make no other changes. Click Next. The installation begins, and takes around 30 minutes to complete.
  7. If you see a prompt to restart your computer after the installation completes, dismiss it.

Continue to the next section to install the Build Tools for Visual Studio 2017.

Install the Build Tools for Visual Studio 2017

The Build Tools for Visual Studio 2017 is the second part of the Microsoft Visual Studio dependency. You must complete this installation.

  1. Go to the Tools for Visual Studio 2017 section of the Microsoft Visual Studio Downloads page. 
  2. Click the Download button next to Build tools for Visual Studio 2017:

    Download Visual Studio 2017 Build Tools
     
  3. An executable file named vs_buildtools.exe, or similar, is saved in your Downloads folder.
  4. Double-click the file to install Build Tools for Visual Studio 2017. 
  5. The installation opens to the Workloads tab. Select Visual C++ build tools:

    Select the build tools to install

    The Summary on the right side shows the features you chose to install:

    Build tool summary
     
  6. Click Install.
  7. When the installation completes, restart your computer if prompted to do so.

You have completed the Visual Studio 2017 installation.

Install your next dependency:

Or if you have installed all the dependencies, you are ready to configure the Model Optimizer.

Install CMake* 3.4 or higher

These steps guide you through installing CMake 3.4 or higher, which is required to build the Intel® Distribution of OpenVINO™ toolkit samples. 

  1. Go to the CMake download site.
  2. Under the heading Get the Software, click the link for latest stable in the first paragraph. Your screen displays Latest Release information. 
  3. Scroll down to the line Windows win64-x64 Installer.
  4. Click the associated file name to download the installer. The file name will have the extension .msi. The file is saved to your Downloads folder.
  5. Go to the Downloads folder.
  6. Double-click the file to launch the installer.

    Note: If you have a previous version of CMake installed, you are prompted to uninstall it. You must uninstall the previous version before installing the new version. Follow the instructions on the screen and then launch the installer again to install the new version.

  7. In the installer, select the option to Add CMake to the system PATH for all users:

    CMake Installation Options
     
  8. Click Next.
  9. Click Finish when the installation completes.

You have completed the CMake installation. Next, install Python 3.6.5 if the Intel® Distribution of OpenVINO™ toolkit installation indicated you are missing the software.

Install Python* 3.6.5

Python 3.6.5 with pip is required to run the Model Optimizer. Use these steps to install the correct version of the Python software. 

  1. Go to the Python 3.6.5 download page and click Windows x86-64 executable installer to download the executable file. The file is saved as python-3.6.5-amd64.exe in your Downloads folder.

    Browse to download Python
     
  2. Double-click the file to launch the installation.
  3. Make sure the top of the screen shows Python 3.6.5 (64-bit).
  4. IMPORTANT: At the bottom of the install screen, select Add Python 3.6 to PATH.
    Install Python and select environment variable option
  5. Click Install Now near the top of the install screen and let the installation complete.
  6. When the installation finishes, click Close.

You have completed the Python installation and are ready to set environment variables. Continue to the next section.

 

Set the environment variables 

You must update several environment variables before you can compile and run OpenVINO™ applications. Open the Command Prompt and run the following batch file to temporarily set your environment variables:

C:\Intel\computer_vision_sdk\bin\setupvars.bat

Note: The OpenVINO™ toolkit environment variables are removed when you close the Command Prompt window. Optionally, you can set the environment variables permanently by adding them manually.
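To confirm from a script that setupvars.bat took effect in the current window, you can check for one of the variables it defines. The variable name below is an assumption; open setupvars.bat to see the exact names your version sets:

```python
import os

def toolkit_env_ready(var_name="INTEL_CVSDK_DIR"):
    # INTEL_CVSDK_DIR is an assumed variable name -- verify it against
    # the setupvars.bat shipped with your toolkit version.
    return var_name in os.environ
```

A False result in a window where you already ran setupvars.bat most likely means the variable name differs in your version.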

The environment variables are set. Continue to the next section to configure the Model Optimizer.

 

Configure the Model Optimizer

Important: These steps are required. You must configure the Model Optimizer for at least one framework. The Model Optimizer will fail if you do not complete the steps in this section.

If you see an error indicating Python is not installed when you know you installed it, your computer might not be able to find the program. For instructions to add Python to your system environment variables, see Update Your Windows Environment Variables.

The Model Optimizer is a key component of the Intel® Distribution of OpenVINO™ toolkit. You cannot do inference on your trained model without running the model through the Model Optimizer. When you run a pre-trained model through the Model Optimizer, your output is an Intermediate Representation (IR) of the network. The IR is a pair of files that describe the whole model:

  • .xml: Describes the network topology
  • .bin: Contains the weights and biases binary data

The Inference Engine reads, loads, and infers the IR files, using a common API across the CPU, GPU, or VPU hardware.
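Because the Inference Engine needs both files of the pair, a small pre-flight check like the following can be useful before loading a model. This is an illustrative sketch, not part of the toolkit:

```python
from pathlib import Path

def find_ir_pair(model_dir, model_name):
    """Return the (.xml, .bin) paths for a model, or None if either is missing.

    The Inference Engine expects both files: the .xml topology and the
    .bin weights share the same base name.
    """
    xml = Path(model_dir) / f"{model_name}.xml"
    weights = Path(model_dir) / f"{model_name}.bin"
    if xml.is_file() and weights.is_file():
        return xml, weights
    return None
```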

The Model Optimizer is a Python*-based command line tool (mo.py), which is located in C:\Intel\computer_vision_sdk_<version>\deployment_tools\model_optimizer, where <version> is the version of the Intel® Distribution of OpenVINO™ toolkit that you installed. Use this tool on models trained with popular deep learning frameworks such as Caffe, TensorFlow, MXNet, and ONNX to convert them to an optimized IR format that the Inference Engine can use.
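Because mo.py is an ordinary Python script, you can invoke it from your own tooling. The sketch below only builds the command line; the --input_model and --output_dir flags are common Model Optimizer options, but check `python mo.py --help` for the options in your installed version:

```python
import sys
from pathlib import Path

def mo_command(install_dir, input_model, output_dir):
    """Build (but do not run) a Model Optimizer command line.

    install_dir is the toolkit root (the computer_vision_sdk_<version>
    directory). Pass the result to subprocess.run() to execute.
    """
    mo_script = (Path(install_dir) / "deployment_tools"
                 / "model_optimizer" / "mo.py")
    return [sys.executable, str(mo_script),
            "--input_model", str(input_model),
            "--output_dir", str(output_dir)]
```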

This section explains how to use scripts to configure the Model Optimizer either for all of the supported frameworks at the same time or for individual frameworks. If you want to manually configure the Model Optimizer instead of using scripts, see the Using Manual Configuration Process section in the Model Optimizer Developer Guide.

For more information about the Model Optimizer, see the Model Optimizer Developer Guide

Model Optimizer Configuration Steps

You can configure the Model Optimizer either for all supported frameworks at once or for one framework at a time. Choose the option that best suits your needs. If you see error messages, make sure you installed all dependencies.

Note: These steps use a command prompt to make sure you see error messages. 

In the steps below:
- Replace <version> with the version number of your Intel® Distribution of OpenVINO™ toolkit
- If you did not install Intel® Distribution of OpenVINO™ toolkit to the default installation directory, replace \Intel\ with the directory where you installed the software. 

Option 1: Configure the Model Optimizer for all supported frameworks at the same time:

  1. Open a command prompt. To do so, type the following in your Search Windows box and then press Enter:
    cmd

    Type commands in the opened window:

    Windows command prompt

  2. Go to the Model Optimizer prerequisites directory. Remember to replace <version> with the version of the Intel® Distribution of OpenVINO™ toolkit that you installed: 
    cd C:\Intel\computer_vision_sdk_<version>\deployment_tools\model_optimizer\install_prerequisites
    
  3. Run the following batch file to configure Model Optimizer for Caffe*, TensorFlow*, MXNet*, Kaldi*, and ONNX*:
    install_prerequisites.bat

Option 2: Configure the Model Optimizer for each framework separately:

  1. Go to the Model Optimizer prerequisites directory:
    cd C:\Intel\computer_vision_sdk_<version>\deployment_tools\model_optimizer\install_prerequisites
  2. Run the batch file for the framework you will use with the Model Optimizer. You can use more than one:
    • For Caffe:
      install_prerequisites_caffe.bat
    • For TensorFlow:
      install_prerequisites_tf.bat
    • For MXNet:
      install_prerequisites_mxnet.bat
    • For ONNX:
      install_prerequisites_onnx.bat
    • For Kaldi:
      install_prerequisites_kaldi.bat

The Model Optimizer is configured for one or more frameworks. Success is indicated by a screen similar to this:

Model Optimizer configured

You are ready to use two short demos to see the results of running the Intel Distribution of OpenVINO toolkit and to verify your installation was successful. The demo scripts are required since they perform additional configuration steps. Continue to the next section.

If you want to use a GPU or VPU, or update your Windows* environment variables, read through the Optional Steps section.

 

Use the Demo Scripts to Verify Your Installation

Important: This section is required. In addition to confirming your installation was successful, demo scripts perform other steps, such as setting up your computer to use the Model Optimizer samples.

Note: To run the demo applications on Intel® Processor Graphics, Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2, make sure you completed the Additional Installation Steps first.

To learn more about the demo applications, see README.txt in C:\Intel\computer_vision_sdk_<version>\deployment_tools\demo\.

For detailed description of the pre-trained object detection and object recognition models, go to C:\Intel\computer_vision_sdk_<version>\deployment_tools\intel_models\ and open index.html.

Notes:
- The paths in this section assume you used the default installation directory. If you used a directory other than C:\Intel, update the directory with the location where you installed the software.
- If you are migrating from the Intel® Computer Vision SDK 2017 R3 Beta version to the Intel® Distribution of OpenVINO™ toolkit, read this information about porting your applications.

  1. Open a command prompt window. 
     
  2. Go to the Inference Engine demo directory:
    cd C:\Intel\computer_vision_sdk_<version>\deployment_tools\demo\
  3. Run the demos by following the instructions in the next two sections.

Run the Image Classification Demo

This demo serves two purposes:

  • It creates a directory named build_<version> in C:\Intel\computer_vision_sdk_<version>\deployment_tools\inference_engine\samples.
  • It uses the Model Optimizer to convert a SqueezeNet model to .bin and .xml Intermediate Representation (IR) files that are used by the Inference Engine.

For a brief description of the Intermediate Representation .bin and .xml files, see Configuring the Model Optimizer.

For more information about the Inference Engine, see the Inference Engine Developer Guide.

  1. Run the Image Classification demo:
    demo_squeezenet_download_convert_run.bat
  2. This demo uses the car.png image located in the C:\Intel\computer_vision_sdk_<version>\deployment_tools\demo directory. When the demo completes, the label and confidence for the top-10 categories are displayed on your screen:
    Image Classification Demo results

    This demo is complete. Leave the console open and continue to the next section to run the Inference Pipeline demo.

Run the Inference Pipeline Demo

  1. While still in the C:\Intel\computer_vision_sdk_<version>\deployment_tools\demo\ directory, run the Inference Pipeline demo:
    demo_security_barrier_camera.bat
    This demo uses the car.png image located in C:\Intel\computer_vision_sdk_<version>\deployment_tools\demo\ to show an inference pipeline built from three pre-trained models. The demo performs vehicle recognition, in which the vehicle attributes identified by each model build on one another to narrow in on a specific attribute. The demo works as follows:
    1. An object is identified as a vehicle.
    2. This identification is used as input to the next model, which identifies specific vehicle attributes, including the license plate.
    3. The attributes identified as the license plate are used as input to the third model, which recognizes specific characters in the license plate.

    For more information, see the Security Camera Sample.​
  2. When the demo completes, you have two windows open:
    • A console window that displays information about the tasks performed by the demo.
    • An image viewer window that displays a picture similar to the following:


       
  3. Close the image viewer window to end the demo.
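The three-stage cascade described above can be sketched as plain function composition, where each stage consumes the previous stage's output. These are toy placeholders standing in for the three pre-trained models, not the models themselves:

```python
def detect_vehicle(frame):
    # Stage 1: stand-in for the vehicle detection model.
    # Returns the region of the frame identified as a vehicle.
    return {"label": "vehicle", "region": frame}

def extract_attributes(detection):
    # Stage 2: stand-in for the attribute model, which locates
    # specific vehicle attributes such as the license plate.
    return {"plate_region": detection["region"]}

def read_plate(attributes):
    # Stage 3: stand-in for the character recognition model,
    # which reads the characters inside the plate region.
    return f"plate characters from {attributes['plate_region']}"

def pipeline(frame):
    # Each model's output becomes the next model's input.
    return read_plate(extract_attributes(detect_vehicle(frame)))
```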

In this section, you saw a preview of the Intel® Distribution of OpenVINO™ toolkit capabilities.

You have completed all the required installation, configuration, and build steps to work with your trained models using CPU.

If you want to use Intel® Processor Graphics (GPU), Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2 (VPU), or add CMake* and Python* to your Windows* environment variables, read through the next section for additional steps.

Read the Summary for your next steps.

 

Optional Steps

Use the optional steps below if you want to:

  • Install the Intel Graphics Driver for Windows*, which is required to use the processor graphics (GPU)
  • Infer models on an Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2
  • Add CMake* or Python* to your Windows* environment variables

If you prefer to use Microsoft Visual Studio 2015 instead of Microsoft Visual Studio 2017 to build your sample applications, refer to Installing Microsoft Visual Studio* 2015 for Intel® Distribution of OpenVINO™ toolkit.

Optional: Additional Installation Steps for Intel® Processor Graphics (GPU)

These steps are required only if you want to use a GPU.

If your applications offload computation to Intel® Integrated Graphics, you must have the Intel Graphics Driver for Windows version 15.65 or higher. To see if you have this driver installed:

  1. Type device manager in your Search Windows box. The Device Manager opens. 
  2. Click the drop-down arrow to view the Display adapters. You see the adapter that is installed in your computer:

    Device Manager
     
  3. Right-click the adapter name and select Properties
  4. Click the Driver tab to see the driver version. Make sure the version number is 15.65 or higher.

    Device Driver Version
  5. If your device driver version is lower than 15.65, download and install a higher version.

You are done updating your device driver and are ready to use your GPU. 

Optional: Additional Installation Steps for Intel® Movidius™ Neural Compute Stick and Neural Compute Stick 2 

Note: These steps are only required if you want to perform inference on Intel® Movidius™ Neural Compute Stick powered by the Intel® Movidius™ Myriad™ 2 VPU or the Intel® Movidius™ Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU. See also Intel® Movidius™ Neural Compute Stick 2 Get Started

For Intel® Movidius™ Neural Compute Stick and Intel® Neural Compute Stick 2, the OpenVINO™ toolkit provides the Movidius™ VSC driver. To install the driver:

  1. Go to the <INSTALL_DIR>\deployment_tools\inference-engine\external\MovidiusDriver\ directory, where <INSTALL_DIR> is the directory in which the Intel Distribution of OpenVINO toolkit is installed.
  2. Right-click the Movidius_VSC_Device.inf file and choose Install from the pop-up menu:
    Install NCS driver

You have installed the driver for your Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2.

Optional: Additional Installation Steps for the Intel® Vision Accelerator Design with Intel® Movidius™ VPUs 

Note: These steps are required only if you want to use Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

To perform inference on Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, the following additional installation steps are required:

  1. Install the Movidius™ VSC driver:
    1. Go to the <INSTALL_DIR>\deployment_tools\inference-engine\external\ directory, where <INSTALL_DIR> is the directory in which the OpenVINO™ toolkit is installed.
    2. Right-click the Movidius_VSC_Device.inf file and choose Install from the pop-up menu.
  2. If your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs card requires SMBUS connection to PCIe slot (Raw video data card with HW version Fab-B and before), install the SMBUS driver:
    1. Go to the <INSTALL_DIR>\deployment_tools\inference-engine\external\hddl\SMBusDriver directory, where <INSTALL_DIR> is the directory in which the OpenVINO™ toolkit is installed.
    2. Right-click the hddlsmbus.inf file and choose Install from the pop-up menu.
  3. Download and install the Visual C++ Redistributable for Visual Studio 2015.

You are done installing your device driver and are ready to use your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.
 

Optional: Update Your Windows Environment Variables

These steps are only required under special circumstances, such as if you forgot to check the box during the CMake* or Python* installation to add the application to your Windows PATH environment variable.

Use these steps to update your Windows PATH if a command you execute returns an error message stating that an application cannot be found. This can happen if you did not add CMake or Python to your PATH environment variable during installation.

  1. In your Search Windows box, type Edit the System Environment Variables and press Enter. A window similar to the following displays:

    System properties screen
     
  2. At the bottom of the screen, click Environment Variables.
  3. Under System variables, click Path and then Edit:

    Select option to set environment variable
     
  4. In the opened window, click Browse. A browse window opens:

    Browse to executable 
     
  5. If you need to add CMake to the PATH, browse to the directory in which you installed CMake. The default directory is C:\Program Files\CMake.
  6. If you need to add Python to the PATH, browse to the directory in which you installed Python. The default directory is C:\Users\<USER_ID>\AppData\Local\Programs\Python\Python36.
  7. Click OK repeatedly to close each screen.

Your PATH environment variable is updated. 
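If you want to confirm the change from a script, a check along these lines works (illustrative; note that Command Prompt windows opened before the change do not see the updated PATH until you reopen them):

```python
import os

def on_path(directory):
    """Return True if `directory` appears in the PATH environment variable.

    Paths are normalized and compared case-insensitively, which matches
    Windows semantics.
    """
    target = os.path.normcase(os.path.normpath(directory))
    return any(
        os.path.normcase(os.path.normpath(entry)) == target
        for entry in os.environ.get("PATH", "").split(os.pathsep)
        if entry
    )
```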

 

Summary

In this document, you installed the Intel® Distribution of OpenVINO™ toolkit and its dependencies. You also configured the Model Optimizer for one or more frameworks. After the software was installed and configured, you ran two demo applications. You might have also installed drivers that will let you use a GPU or VPU to infer your models.

You are now ready to learn more about converting models trained with popular deep learning frameworks to the Inference Engine format, following the links below, or you can move on to running the sample applications.

To learn more about converting deep learning models, go to:


Additional Resources

Intel® Distribution of OpenVINO™ toolkit home page: https://software.intel.com/en-us/openvino-toolkit

Intel® Distribution of OpenVINO™ toolkit documentation: https://software.intel.com/en-us/openvino-toolkit/documentation/featured

For Intel® Distribution of OpenVINO™ toolkit Hello World activities, see the Inference Tutorials for Face Detection and Car Detection Exercises.

Intel® Neural Compute Stick 2 Get Started page: https://software.intel.com/en-us/neural-compute-stick/get-started

 


Legal Information

You may not use or facilitate the use of this document in connection with any infringement or other legal analysis concerning Intel products described herein. You agree to grant Intel a non-exclusive, royalty-free license to any patent claim thereafter drafted which includes subject matter disclosed herein.

No license (express or implied, by estoppel or otherwise) to any intellectual property rights is granted by this document.

All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest Intel product specifications and roadmaps.

The products described may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.

Intel technologies' features and benefits depend on system configuration and may require enabled hardware, software or service activation. Learn more at http://www.intel.com/ or from the OEM or retailer.

No computer system can be absolutely secure.

Intel, Arria, Core, Movidius, Pentium, Xeon, OpenVINO, and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

OpenCL and the OpenCL logo are trademarks of Apple Inc. used by permission by Khronos.

*Other names and brands may be claimed as the property of others.

Copyright © 2018, Intel Corporation. All rights reserved.

 
For more complete information about compiler optimizations, see our Optimization Notice.