Installing the OpenVINO™ Toolkit for Windows* 10

NOTE: The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.
These steps apply to Windows* 10. For Linux* instructions, see the Linux installation guide.

Introduction

The OpenVINO...

Installing the OpenVINO™ Toolkit for Linux*

NOTE: The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.
These steps apply to Ubuntu*, CentOS*, and Yocto*. If you are using the OpenVINO™ toolkit on Windows*, see the Installation Guide...

OpenVINO™ Toolkit Release Notes

Introduction

NOTE: The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.

The OpenVINO™ toolkit is a comprehensive toolkit for quickly developing applications and solutions that emulate...

Get Started with the IEI Tank AIoT Dev Kit* and Arduino Create*

Last updated: June 13, 2018

Follow these steps to connect your IEI* Tank AIoT Dev Kit to Arduino Create* and begin working with your development kit. This guide assumes you've already set up and powered your system according to the guide included in the box.

You’ll...

Inference Engine Developer Guide

Deploying deep learning networks from the training environment to embedded platforms for inference is a complex task. The Inference Engine deployment process converts a trained model to an Intermediate Representation.

Model Optimizer Developer Guide

Introduction

The Model Optimizer is a cross-platform command-line tool that facilitates the transition between the training and deployment environment, performs static model analysis, and adjusts deep learning models for optimal execution on end-...

Inference Engine Samples

Image Classification Sample Description

The Image Classification sample application performs inference using image classification networks such as AlexNet* and GoogLeNet*. The sample application reads command-line parameters and loads a network and...

Intel's OpenVX Developer Guide

Last updated: May 16, 2018

Intel's OpenVX* API is delivered as part of the Open Visual Inference & Neural network Optimization (OpenVINO™) toolkit, a software development package for developing and optimizing computer vision and image processing...

Using the Model Optimizer to Convert Caffe* Models

Introduction

The Model Optimizer is a cross-platform command-line tool that facilitates the transition between the training and deployment environment, performs static model analysis, and adjusts deep learning models for optimal execution on end...

API Reference for OpenVINO™ toolkit Kernel Extensions for OpenVX*

Intel Extensions for OpenVX*

Version 0.4

Using the Model Optimizer to Convert MXNet* Models

Introduction

NOTE: The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.

The Model Optimizer is a cross-platform command-line tool that facilitates the transition between the training...

Installing the OpenVINO™ Toolkit for Linux* with FPGA Beta Support

Introduction

NOTES:
The OpenVINO™ toolkit was formerly known as the Intel® Computer Vision SDK.
These steps apply to Ubuntu*, CentOS*, and Yocto*.

The OpenVINO™ toolkit quickly deploys applications...
