Get Started with Intel® Neural Compute Stick 2

Authored by: Neal P Smith, Andrew Herrold

Published: 04/25/2019   Last updated: 03/28/2019

Follow the step-by-step instructions below to set up your Intel® Neural Compute Stick 2 (Intel® NCS 2) or the original Intel® Movidius™ NCS. Also, check out the getting started videos for your platform:

Start: Linux*

Start: Raspbian*

Start: Windows*

Step 1. Gather Your Equipment

(Image: Intel® Neural Compute Stick 2)

Before you start, make sure you have the following:

  • An x86-64 host computer with Windows® 10 or Ubuntu* (16.04 or 18.04) to run the Intel® Distribution of OpenVINO™ toolkit.
  • Intel® Neural Compute Stick 2 (Intel® NCS 2). Buy Now
  • An internet connection to download and install the Intel® Distribution of OpenVINO™ toolkit.

Note This article is based on the 2019 R1 release of the Intel® Distribution of OpenVINO™ toolkit.

Note To use a Raspberry Pi as the host for the Intel® NCS 2, it's still recommended that you follow the getting started instructions on this page to install the full Intel® Distribution of OpenVINO™ toolkit on one of the supported platforms (Windows*, Linux*), then install the Inference Engine on your Raspberry Pi and review the Raspberry Pi workflow.

Migrate your projects: The Intel® Distribution of OpenVINO™ toolkit also supports the original Intel® Movidius™ NCS device. Existing Intel® Movidius™ Neural Compute SDK (NCSDK) projects can be transitioned to the Intel® Distribution of OpenVINO™ toolkit.

Step 2. Install the OpenVINO™ Toolkit

Download the appropriate version of the Intel® Distribution of OpenVINO™ toolkit for your host computer (Windows* or Linux*). This guide assumes the full package installation is downloaded. Then, follow the installation instructions for your OS and customize the installation as shown in the image below.

(Image: recommended installation customization)

For Linux*

Run the following commands:

cd ~/Downloads
tar xvf l_openvino_toolkit_<VERSION>.tgz
cd l_openvino_toolkit_<VERSION>
sudo -E ./install_GUI.sh

For Windows*

Double click the downloaded w_openvino_toolkit_<VERSION>.exe file.

Follow the on-screen instructions to continue the installation process and customize the installation as desired. Install any dependencies flagged during installation before continuing to Step 3.

After the installation completes, return to this page and continue with Step 3 to verify that your installation is correct.

Step 3. Configure Neural Compute Stick USB Driver

Windows users can skip to Step 4.

The USB driver on Linux must be configured for the Intel® Neural Compute Stick. To do this, run the following commands in a terminal window:

source ~/intel/openvino/bin/setupvars.sh
cd ~/intel/openvino/install_dependencies
sudo -E ./install_NCS_udev_rules.sh

Step 4. Test the Installation

Plug in the Neural Compute Stick to a USB port on your computer.
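As an optional sanity check on Linux (not part of the official guide; it assumes the stick enumerates under the Intel Movidius vendor ID 03e7 before a network is loaded onto it), you can look for the device in the USB listing:

```shell
# Optional check (assumption: the NCS 2 shows up with Intel Movidius
# vendor ID 03e7 in lsusb output once the udev rules are in place).
if lsusb 2>/dev/null | grep -q "03e7"; then
    echo "Neural Compute Stick detected"
else
    echo "Neural Compute Stick not detected - recheck the USB connection and the udev rules from Step 3"
fi
```

If the device does not appear, try a different USB port or re-run the udev rules commands from Step 3.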

Run the following commands for your OS in the terminal and look for the results in the image below.


For Linux*

cd ~/intel/openvino/deployment_tools/model_optimizer/install_prerequisites/
./install_prerequisites.sh
cd ~/intel/openvino/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh -d MYRIAD


For Windows*

cd C:\"Program Files (x86)"\IntelSWTools\openvino\deployment_tools\demo
.\demo_squeezenet_download_convert_run.bat -d MYRIAD

Look for results like these, which indicate a successful installation:

(Image: 2019 R1 SqueezeNet demo results)

Next Steps: The ncappzoo

The ncappzoo, at https://github.com/movidius/ncappzoo, is an open source GitHub repository that contains numerous examples with a simple layout and easy-to-use Makefiles. The repository is tailored to the Intel® NCS 2 developer community and helps developers get started quickly by focusing on application code that uses pretrained neural networks. Visit the repository's README file in your browser, or jump in and clone it now with this command:

git clone https://github.com/movidius/ncappzoo.git

After you've cloned it, you can navigate to any app directory (all are under the ncappzoo/apps directory) in a terminal window and type:

make run

This is an easy way to explore the examples in the ncappzoo.
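The convention can be sketched with a minimal, hypothetical Makefile (for illustration only; the real ncappzoo Makefiles also download models and build sources): each app provides a `run` target so a single `make run` does everything.

```shell
# Hypothetical, minimal stand-in for an ncappzoo app Makefile: in the real
# repository, the `run` target downloads the model, builds, and launches the app.
mkdir -p /tmp/ncappzoo_style_app
cd /tmp/ncappzoo_style_app
# printf preserves the tab character that Makefile recipe lines require
printf 'run:\n\t@echo "model downloaded, app built and running"\n' > Makefile
make run
```

The single-target pattern means you don't need to know each app's build steps or model sources up front; the Makefile encodes them.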


Other Examples

There are also other examples included with the Intel® Distribution of OpenVINO™ toolkit that you can run. Here are a few sample commands and their output.

(Image: a car detected with the traffic camera demo)

Traffic Camera (Object Detection)


For Linux*

cd ~/intel/openvino/deployment_tools/demo
./demo_security_barrier_camera.sh -d MYRIAD


For Windows*

cd C:\"Program Files (x86)"\IntelSWTools\openvino\deployment_tools\demo
.\demo_security_barrier_camera.bat -d MYRIAD

This example runs multiple neural networks: vehicle and license plate detection, vehicle attribute recognition, and license plate recognition. The demo script wraps a single command with multiple options. To run the same program manually, enter the commands below for your OS in a terminal window or command prompt window.


For Linux*

source ~/intel/openvino/bin/setupvars.sh
cd ~/inference_engine_samples_build/intel64/Release
./security_barrier_camera_demo -i ~/intel/openvino/deployment_tools/demo/car_1.bmp -d MYRIAD -m ~/openvino_models/ir/FP16/Security/object_detection/barrier/0106/dldt/vehicle-license-plate-detection-barrier-0106-fp16.xml -d_va MYRIAD -m_va ~/openvino_models/ir/FP16/Security/object_attributes/vehicle/resnet10_update_1/dldt/vehicle-attributes-recognition-barrier-0039-fp16.xml -d_lpr MYRIAD -m_lpr ~/openvino_models/ir/FP16/Security/optical_character_recognition/license_plate/dldt/license-plate-recognition-barrier-0001-fp16.xml


For Windows*

C:\"Program Files (x86)"\IntelSWTools\openvino\bin\setupvars.bat
cd %USERPROFILE%\Documents\Intel\OpenVINO\inference_engine_samples_build_2017\intel64\Release
.\security_barrier_camera_demo.exe -i "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\demo\car_1.bmp" -d MYRIAD -m "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\object_detection\barrier\0106\dldt\vehicle-license-plate-detection-barrier-0106-fp16.xml" -d_va MYRIAD -m_va  "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\object_attributes\vehicle\resnet10_update_1\dldt\vehicle-attributes-recognition-barrier-0039-fp16.xml" -d_lpr MYRIAD -m_lpr "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\optical_character_recognition\license_plate\dldt\license-plate-recognition-barrier-0001-fp16.xml"




Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.

Notice revision #20110804