The key components of the Intel® Distribution of OpenVINO™ toolkit are the Model Optimizer and the Inference Engine. The Model Optimizer converts trained models into an optimized Intermediate Representation (IR), while the Inference Engine loads the IR and runs inference on the target hardware, enabling application deployment.
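The two components above can be sketched in a few lines with the legacy Inference Engine Python API (OpenVINO™ releases up to 2021.x). The helper name and the file paths are illustrative; the Model Optimizer is assumed to have already produced the `.xml`/`.bin` IR pair.

```python
def run_inference(model_xml, model_bin, input_blob_name, input_data, device="CPU"):
    """Minimal sketch: load an IR model and run synchronous inference.

    model_xml / model_bin: paths to the IR files produced by the Model Optimizer.
    input_blob_name: name of the network's input blob (model-specific).
    input_data: a numpy array shaped to match the input blob.
    """
    # Deferred import so the helper can be defined without OpenVINO installed.
    from openvino.inference_engine import IECore

    ie = IECore()
    # Read the Intermediate Representation produced by the Model Optimizer.
    net = ie.read_network(model=model_xml, weights=model_bin)
    # Compile the network for the chosen device (CPU, GPU, MYRIAD, ...).
    exec_net = ie.load_network(network=net, device_name=device)
    # Synchronous inference; returns a dict of output blob name -> numpy array.
    return exec_net.infer(inputs={input_blob_name: input_data})
```

Newer OpenVINO releases replace this API with `openvino.runtime.Core`, but the load-compile-infer flow is the same.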
Add-Ons and Tools
Use the tools and utilities included in the toolkit to streamline evaluation, benchmarking, and fine-tuning. Extend the toolkit beyond its core capabilities with add-ons.
Models and Demos
The Open Model Zoo includes a collection of pretrained models for common deep learning tasks and support for public models. The repository also includes a collection of code samples and demo applications to speed up development.
Integrated Libraries and Functions
The toolkit provides access to complementary libraries and standards, including OpenCV*, OpenVX*, and OpenCL*. Use OpenCV* for standardized computer vision functions, OpenVX* for cross-platform acceleration of vision applications, and OpenCL* for parallel programming across heterogeneous platforms.
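A common place where OpenCV* meets the Inference Engine is input preprocessing. Below is a hedged sketch of a preprocessing helper: the function name and layout assumptions (BGR input, NCHW network layout) are illustrative, though NCHW is the typical layout for IR models.

```python
def preprocess_for_ir(image_path, height, width):
    """Load an image with OpenCV and reshape it for a typical IR model input.

    Returns a numpy array of shape (1, 3, height, width).
    """
    # Deferred imports so the helper can be defined without cv2 installed.
    import cv2
    import numpy as np

    img = cv2.imread(image_path)            # HWC, BGR channel order
    if img is None:
        raise FileNotFoundError(image_path)
    img = cv2.resize(img, (width, height))  # note: cv2.resize takes (width, height)
    img = img.transpose((2, 0, 1))          # HWC -> CHW, as most IR models expect
    return np.expand_dims(img, axis=0)      # add the batch dimension
```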
How It Works
Walk through the development and deployment process, from building models to deploying applications across target platforms.
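That end-to-end flow typically looks like the two-step command sketch below: convert a trained model to IR with the Model Optimizer, then exercise it on a target device. The file names are placeholders, and the exact entry point (`mo` vs. `mo.py`) varies by toolkit release.

```shell
# Step 1: convert a trained model (here assumed ONNX) to OpenVINO IR.
mo --input_model model.onnx --output_dir ir/

# Step 2: benchmark the resulting IR on a target device with the bundled tool.
benchmark_app -m ir/model.xml -d CPU
```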