The Intel® Distribution of OpenVINO™ toolkit is a tool suite for high-performance deep learning. With continuous product improvement and security updates in mind, Intel releases a new version of the Intel Distribution of OpenVINO toolkit every 3-4 months. This standard release introduces new features and tools; extends support for additional hardware, libraries, operating systems, and public models; and includes security and stability updates.

In June 2020, Intel introduced the first Long-Term Support (LTS) release of the Intel Distribution of OpenVINO toolkit. The LTS release is a stable and reliable release maintained for a longer period of time than the standard releases. It also reduces the risk and cost associated with upgrading versions, such as accidentally introducing new bugs or breaking existing functionality. The LTS release is intended to harden existing features rather than introduce new ones.
Sample Use Cases
- Standard Release (3-4 releases a year): Users looking to take advantage of new features, tools, and support to keep current with advancements in deep learning technologies
- Long-Term Support Release: Users looking for a stable, reliable version that is maintained for a longer period of time, with little to no new feature change
Intel® Distribution of OpenVINO™ toolkit Long-Term Support Policy
- Trigger Events: A new LTS update will be published in the following scenarios:
- Critical issues, such as an application hang or freeze, an application crash, a memory leak, or a user-specific security issue
- An environment update occurs and produces a new issue. Environment updates include:
- a new operating system (OS) release (e.g. Ubuntu 18.04.3 → 18.04.4)
- new security patches in the OS kernel, in the build compiler, or others
- A Critical or High severity issue is raised in the Common Vulnerabilities and Exposures (CVE) database that affects one of the third-party components used in the OpenVINO™ toolkit
- A security update in a third-party component used inside the OpenVINO™ toolkit that introduces blocking issues for users
- Intel introduces new hardware that is supported by the Intel Distribution of OpenVINO toolkit
Note: Support for a new hardware instruction set will NOT be provided with LTS releases, because it requires significant functional and structural changes that may accidentally introduce new issues.
- Issue Reporting: To report issues, use Intel® Premier Support, clearly stating the issue, its impact, and the expected timeline.
- Lifecycle: New LTS releases will be introduced every year. For the first year, both functional fixes and CVE fixes are backported into the LTS release. For one additional year, only CVE fixes are backported.
- Distribution: The list of changes is provided in the Release Notes - OpenVINO v.2020.3 LTS
- Application Binary Interface (ABI) and Release Usage: There is no ABI compatibility guarantee for the OpenVINO™ toolkit Inference Engine runtime. It is recommended to rebuild your application each time you pick up an update.
- Backward Compatibility is supported
- If you created and compiled your application against the Inference Engine Core API of a release after the last LTS, your API calls will also work with the current LTS version.
- The Inference Engine supports IR version(s) introduced with the LTS release and the IR version(s) introduced in the previous release (e.g. IRv6 for 2020.1; IRv6 and IRv7 for 2020.2).
- Environment variables and directory structures are frozen for LTS releases; no structural changes are allowed.
- With the introduction of a new major version in the standard releases, backward compatibility may break in the Inference Engine Core API (e.g. 2020.3 → 2021.1).
- Forward Compatibility is NOT supported
- If you created and compiled your application with a version newer than the LTS, there is no guarantee that it will build against the Inference Engine from the LTS release.
- The Inference Engine does NOT support IR versions introduced by the Model Optimizer from a release newer than the LTS.
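The IR-version rule above can be illustrated with a small check. This is only a sketch, not OpenVINO API code: it reads the `version` attribute that IR `.xml` files declare on their root `<net>` element, and the supported-version set is a stated assumption (IRv6 and IRv7, following the 2020.2 example above), not a claim about any specific LTS release.

```python
import xml.etree.ElementTree as ET

# Hypothetical set of IR versions a given LTS runtime accepts: the version
# introduced with the LTS release plus the one from the previous release
# (e.g. IRv6 and IRv7 in the 2020.2 example above).
SUPPORTED_IR_VERSIONS = {6, 7}

def ir_version(ir_xml: str) -> int:
    """Read the IR version declared on the root <net> element of an IR .xml."""
    return int(ET.fromstring(ir_xml).attrib["version"])

def runtime_can_load(ir_xml: str) -> bool:
    """True when the model's IR version is one the assumed LTS runtime supports."""
    return ir_version(ir_xml) in SUPPORTED_IR_VERSIONS
```

For example, `runtime_can_load('<net name="m" version="7"/>')` returns True under this assumed set, while an IR produced by a newer Model Optimizer (say, a higher version number) would be rejected.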
Components Supported in the Long-Term Support Policy
- Components that expose the OpenVINO toolkit Inference Engine Core API, such as
- the Inference Engine Core and the Inference Engine plugins for CPU, GPU, Myriad, HDDL, GNA, and Multi-Device,
- underlying dependencies with low-level libraries—Threading Building Blocks (TBB), nGraph, Intel® Deep Neural Network Library (Intel® DNNL) or Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN), Compute Library for Deep Neural Networks (clDNN), etc., and
- underlying dependencies with hardware-specific OpenCL compilers, drivers and firmware.
- Development tools in the OpenVINO toolkit, such as
- Model Optimizer,
- Post-training Optimization Tool, and
- the Deep Learning (DL) Workbench.
Functionalities Supported in the Long-Term Support Policy
- See the official documentation for the list of supported devices, model formats, input/output precisions, input/output layouts, and supported layers.
- For the best-known configurations, review the documentation on how to Get a Deep Learning Model Performance Boost with Intel® Platforms.
- Release Notes include system requirements (supported hardware targets and corresponding operating systems).
Testing Supported in the Long-Term Support Policy
- No regression allowed: Each user issue must be covered with the corresponding regression test.
- White-box: Unit, behavior and functional tests
- Black-box: Performance, backward compatibility, load (7x24) and stress testing
- Security: Code coverage, static analysis, BDBA scans, and others.
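The "no regression allowed" rule is typically enforced by pinning each fixed user issue with a test. The sketch below is purely illustrative: the function, the bug it fixes, and the issue ID are all invented for the example and are not part of the toolkit.

```python
# Hypothetical example of the regression-test practice: once a user-reported
# issue is fixed, a test named after the tracker entry pins the corrected
# behavior so a later backport cannot silently reintroduce the bug.

def normalize_device_name(name: str) -> str:
    """Illustrative fixed function: the (invented) bug was that device names
    with stray whitespace or lowercase letters failed plugin lookup."""
    return name.strip().upper()

def test_issue_12345_device_name_lookup():
    # Regression test covering the hypothetical issue report.
    assert normalize_device_name(" cpu") == "CPU"
    assert normalize_device_name("Myriad") == "MYRIAD"
```

Naming the test after the tracker entry makes it easy to audit that every reported issue is covered, which is what the policy above requires.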
Components NOT Supported in the Long-Term Support Policy
- Fixes for the deprecated Inference Engine API will not be introduced.
- Other components not covered under the LTS policy include:
- Inference Engine FPGA plugin and its dependencies*
- Open Model Zoo
- Intel® Media SDK
- Deep Learning (DL) Streamer
* The Inference Engine FPGA plugin and its dependencies still receive updates in this release and are still shipped in the package; however, they are not covered under the LTS policy.
Product and Performance Information
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.