Get Started with Intel® Neural Compute Stick 2

Follow the step-by-step instructions below to set up your Intel® Neural Compute Stick 2 (Intel® NCS 2) or the original Intel® Movidius™ NCS. Also, check out the getting started videos for your platform.

Step 1. Gather Your Equipment

[Image: Intel® Neural Compute Stick 2]

Before you start, make sure you have the following:

  • An x86-64 host computer with Windows® 10 or Ubuntu* 16.04 for the Intel® Distribution of OpenVINO™ toolkit.
  • Intel® Neural Compute Stick 2 (Intel® NCS 2).
  • An internet connection to download and install the Intel® Distribution of OpenVINO™ toolkit.

Note This article is based on the 2019 R1 release of the Intel® Distribution of OpenVINO™ toolkit.

Note To use a Raspberry Pi* as the host for the Intel® NCS 2, it's recommended that you still follow the getting started instructions on this page to install the full Intel® Distribution of OpenVINO™ toolkit on one of the supported platforms (Windows*, Linux*), then install the Inference Engine on your Raspberry Pi and review the Raspberry Pi workflow.

Migrate your projects: The Intel® Distribution of OpenVINO™ toolkit also supports the original Intel® Movidius™ NCS device. Existing Intel® Movidius™ Neural Compute SDK (NCSDK) projects can be transitioned to the Intel® Distribution of OpenVINO™ toolkit.

Step 2. Install the OpenVINO™ Toolkit

Download the appropriate version of the Intel® Distribution of OpenVINO™ toolkit for your host computer (Windows* or Linux*). This guide assumes the full package installation is downloaded. Then, follow the installation instructions for your OS and customize the installation as shown in the image below.

[Image: recommended customization]

For Linux*

Run the following commands:

cd ~/Downloads 
tar xvf l_openvino_toolkit_<VERSION>.tgz
cd l_openvino_toolkit_<VERSION>
sudo -E ./install_openvino_dependencies.sh 
./install_GUI.sh

For Windows*

Double-click the downloaded w_openvino_toolkit_<VERSION>.exe file.

Follow the on-screen instructions to continue the installation process and customize the installation as desired. Install any dependencies shown during the installation before continuing.

After the installation completes, return to this page and continue with Step 3 to verify that your installation is correct.

Step 3. Configure Neural Compute Stick USB Driver

Windows users can skip to Step 4.

The USB driver on Linux must be configured for the Intel® Neural Compute Stick 2. To do this, run the following commands in a terminal window:

source ~/intel/openvino/bin/setupvars.sh
cd ~/intel/openvino/install_dependencies
./install_NCS_udev_rules.sh
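Before running the udev script, you can sanity-check that sourcing setupvars.sh took effect in your current shell. This is a minimal sketch that assumes setupvars.sh exports the INTEL_OPENVINO_DIR variable; confirm the exact variable name against your installed release:

```shell
# Check whether the OpenVINO environment has been loaded in this shell.
# INTEL_OPENVINO_DIR is assumed to be exported by setupvars.sh; verify
# the variable name for your version of the toolkit.
if [ -n "${INTEL_OPENVINO_DIR}" ]; then
    echo "OpenVINO environment loaded: ${INTEL_OPENVINO_DIR}"
else
    echo "OpenVINO environment not loaded - run 'source ~/intel/openvino/bin/setupvars.sh' first"
fi
```

If the variable is empty, source setupvars.sh again in the same terminal before continuing.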

Step 4. Test the Installation

Plug the Intel® Neural Compute Stick 2 into a USB port on your computer.
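On Linux, you can confirm that the stick is visible before running the demo by checking the USB bus. The check below assumes the device enumerates with the Intel Movidius vendor ID 03e7; verify this against your own lsusb output:

```shell
# Look for the Movidius USB vendor ID (03e7 - an assumption; verify
# locally) in the list of attached USB devices.
if lsusb 2>/dev/null | grep -qi "03e7"; then
    echo "NCS device detected on the USB bus"
else
    echo "NCS device not found - check the connection and udev rules"
fi
```

If the device is not found, try a different USB port and make sure the udev rules from Step 3 were installed.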

Run the following commands for your OS in a terminal window and look for results like those shown in the image below.

Linux

cd ~/intel/openvino/deployment_tools/model_optimizer/install_prerequisites/
./install_prerequisites.sh
cd ~/intel/openvino/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh -d MYRIAD

Windows

C:
cd C:\"Program Files (x86)"\IntelSWTools\openvino\deployment_tools\demo
.\demo_squeezenet_download_convert_run.bat -d MYRIAD

Look for results like the following, which indicate a successful installation:

[Image: 2019 R1 results for SqueezeNet]

Next Steps: The ncappzoo

The ncappzoo at http://www.github.com/movidius/ncappzoo is an open source GitHub repository that contains numerous examples with a simple layout and easy-to-use Makefiles. This repository is tailored for the Intel® NCS 2 developer community and helps developers get started quickly by focusing on application code that uses pretrained neural networks. Visit the repository's README file via your browser, or jump in and clone it now with this command:

git clone http://github.com/movidius/ncappzoo

After you've cloned it, you can navigate into any app directory (all apps are in the ncappzoo/apps directory) from a terminal window and type:

make run

This is an easy way to explore the examples in the ncappzoo.

 

Other Examples

There are also other examples in the Intel® Distribution of OpenVINO™ toolkit that you can run. Here are a few sample commands and their output.

[Image: a car detected with traffic camera]

Traffic Camera (Object Detection)

Linux

cd ~/intel/openvino/deployment_tools/demo
./demo_security_barrier_camera.sh -d MYRIAD

Windows

C:
cd C:\"Program Files (x86)"\IntelSWTools\openvino\deployment_tools\demo
.\demo_security_barrier_camera.bat -d MYRIAD

This example runs multiple neural networks: vehicle and license plate detection, vehicle attribute recognition, and license plate recognition. The demo script runs a command with multiple options. To run the same program manually, enter the commands below in a terminal window or command prompt window.

Linux

source ~/intel/openvino/bin/setupvars.sh
cd ~/inference_engine_samples_build/intel64/Release
./security_barrier_camera_demo \
  -i ~/intel/openvino/deployment_tools/demo/car_1.bmp \
  -d MYRIAD \
  -m ~/openvino_models/ir/FP16/Security/object_detection/barrier/0106/dldt/vehicle-license-plate-detection-barrier-0106-fp16.xml \
  -d_va MYRIAD \
  -m_va ~/openvino_models/ir/FP16/Security/object_attributes/vehicle/resnet10_update_1/dldt/vehicle-attributes-recognition-barrier-0039-fp16.xml \
  -d_lpr MYRIAD \
  -m_lpr ~/openvino_models/ir/FP16/Security/optical_character_recognition/license_plate/dldt/license-plate-recognition-barrier-0001-fp16.xml

Windows

C:
C:\"Program Files (x86)"\IntelSWTools\openvino\bin\setupvars.bat
cd %USERPROFILE%\Documents\Intel\OpenVINO\inference_engine_samples_build_2017\intel64\Release
.\security_barrier_camera_demo.exe ^
  -i "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\demo\car_1.bmp" ^
  -d MYRIAD ^
  -m "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\object_detection\barrier\0106\dldt\vehicle-license-plate-detection-barrier-0106-fp16.xml" ^
  -d_va MYRIAD ^
  -m_va "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\object_attributes\vehicle\resnet10_update_1\dldt\vehicle-attributes-recognition-barrier-0039-fp16.xml" ^
  -d_lpr MYRIAD ^
  -m_lpr "%USERPROFILE%\Documents\Intel\OpenVINO\openvino_models\ir\FP16\Security\optical_character_recognition\license_plate\dldt\license-plate-recognition-barrier-0001-fp16.xml"

References

For more complete information about compiler optimizations, see our Optimization Notice.