How It Works

Use the Open Model Zoo to find open-source, pretrained, and preoptimized models ready for inference, or use your own deep-learning model.

Open Model Zoo

Run the trained model through the Model Optimizer to convert it to an Intermediate Representation (IR), a pair of files (.xml and .bin): the .xml file describes the network topology, and the .bin file contains the binary data for the model's weights and biases.

Model Optimizer Developer Guide
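The two-file split can be illustrated with a small, stdlib-only sketch. The XML schema below is a toy stand-in, not the real IR format, but the division of labor is the same: topology and weight locations live in the .xml file, while the .bin file is a flat blob of binary weight data that the topology indexes into by offset.

```python
import struct
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

# Toy IR-like topology (hypothetical schema, for illustration only):
# one layer whose weights occupy bytes [0, 16) of the .bin file.
IR_XML = """<?xml version="1.0"?>
<net name="toy_net" version="10">
  <layers>
    <layer id="0" name="fc1" type="MatMul">
      <data weights_offset="0" weights_size="16"/>
    </layer>
  </layers>
</net>
"""

def write_ir_pair(directory):
    """Write a toy .xml/.bin pair: four float32 weights in the .bin file."""
    xml_path = Path(directory) / "toy_net.xml"
    bin_path = Path(directory) / "toy_net.bin"
    xml_path.write_text(IR_XML)
    bin_path.write_bytes(struct.pack("<4f", 0.1, 0.2, 0.3, 0.4))
    return xml_path, bin_path

def load_ir_pair(xml_path, bin_path):
    """Parse topology from .xml, then slice each layer's weights out of .bin."""
    root = ET.parse(xml_path).getroot()
    blob = Path(bin_path).read_bytes()
    layers = {}
    for layer in root.iter("layer"):
        data = layer.find("data")
        off = int(data.get("weights_offset"))
        size = int(data.get("weights_size"))
        weights = struct.unpack(f"<{size // 4}f", blob[off:off + size])
        layers[layer.get("name")] = {"type": layer.get("type"),
                                     "weights": weights}
    return root.get("name"), layers

with tempfile.TemporaryDirectory() as d:
    xml_path, bin_path = write_ir_pair(d)
    name, layers = load_ir_pair(xml_path, bin_path)
    print(name, layers["fc1"]["type"], len(layers["fc1"]["weights"]))
    # prints: toy_net MatMul 4
```

Keeping the weights in a separate binary file means the (large) tensor data never has to be text-encoded, and the (small) topology file stays human-readable.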

Use the Inference Engine to run inference and output results across multiple processors, accelerators, and environments with write-once, deploy-anywhere efficiency.

Inference Engine Developer Guide
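The write-once, deploy-anywhere idea can be sketched conceptually: the application calls one inference entry point, and a device plugin chosen at load time does the work. Every name below is hypothetical, not the actual Inference Engine API; the real engine expresses the same idea through device plugins selected by name (e.g. "CPU", "GPU").

```python
from typing import Callable, Dict, List, Sequence

# A "plugin" here is just a function from inputs to outputs.
Plugin = Callable[[Sequence[float]], List[float]]

def cpu_plugin(inputs: Sequence[float]) -> List[float]:
    # Toy "inference": double each input value.
    return [2.0 * x for x in inputs]

# Registry of available device plugins (only CPU in this sketch).
PLUGINS: Dict[str, Plugin] = {"CPU": cpu_plugin}

def load_network(device_candidates: Sequence[str]) -> Plugin:
    """Pick the first available device plugin, falling back down the list."""
    for device in device_candidates:
        if device in PLUGINS:
            return PLUGINS[device]
    raise RuntimeError(f"no plugin available for {device_candidates}")

# Application code is identical regardless of which device is selected:
# here GPU is unavailable, so the loader falls back to CPU.
infer = load_network(["GPU", "CPU"])
result = infer([1.0, 2.5])
print(result)  # prints: [2.0, 5.0]
```

The application never branches on the device itself; retargeting to a new accelerator only means adding an entry to the plugin registry, which is the portability property the paragraph above describes.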
