• 2021.2
  • 06/30/2021
  • Public Content

How it Works

Edge Insights for Autonomous Mobile Robots (EI for AMR) modules are deployed via Docker* containers for an enhanced Developer Experience (DX), support of Continuous Integration and Continuous Deployment (CI/CD) practices, and flexible deployment in different execution environments, including robot, development PC, Edge Server, and Cloud.
This section provides an overview of the modules and services featured with Edge Insights for Autonomous Mobile Robots.

Modules and Services

The Intel® oneAPI Base Toolkit (Base Kit) and Intel® Distribution of OpenVINO™ toolkit middleware layered architecture abstracts hardware dependencies from the algorithm implementation.
ROS2 with a DDS service distribution layer is used as the message bus. This publisher-subscriber architecture based on ROS2 topics decouples data providers from consumers.
Camera and LIDAR sensor data is abstracted with ROS2 topics.
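The decoupling that ROS2 topics provide can be illustrated with a minimal sketch. This is plain Python for illustration only, not the actual rclpy/DDS API; the `TopicBus` class and the topic name below are hypothetical:

```python
# Conceptual sketch of the publish-subscribe pattern used by ROS2 topics.
# Plain Python for illustration; real ROS2 nodes would use rclpy and DDS.
from collections import defaultdict
from typing import Any, Callable, Dict, List


class TopicBus:
    """Minimal message bus: publishers and subscribers share only a topic name."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> None:
        # The publisher never references its consumers directly.
        for callback in self._subscribers[topic]:
            callback(message)


bus = TopicBus()
received = []
# A consumer (e.g. a SLAM node) subscribes to a sensor topic.
bus.subscribe("/camera/depth/points", received.append)
# A producer (e.g. a camera driver) publishes without knowing the consumers.
bus.publish("/camera/depth/points", {"frame": 1, "points": [(0.1, 0.2, 1.5)]})
print(received[0]["frame"])  # → 1
```

Because producer and consumer only agree on the topic name, either side can be replaced (e.g. a simulated sensor for a real one) without changing the other.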
Video stream processing pipelines are supported by GStreamer*. GStreamer decouples sensor ingestion, video processing, and AI object detection via the Intel® Distribution of OpenVINO™ toolkit DL Streamer framework.
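Such a pipeline can be sketched as a gst-launch command. This is a hypothetical fragment, not a pipeline shipped with the package; it requires an installed DL Streamer, and the device node and model path are placeholders:

```shell
# Hypothetical DL Streamer pipeline sketch: sensor ingestion (v4l2src),
# video processing (decodebin/videoconvert), and AI object detection
# (gvadetect) are separate, swappable elements.
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! \
  gvadetect model=/path/to/model.xml device=CPU ! gvawatermark ! \
  videoconvert ! autovideosink
```

Each stage can be exchanged (e.g. `filesrc` instead of `v4l2src`, or `device=GPU`) without touching the rest of the pipeline, which is the decoupling described above.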
Also, more complex computational graphs that decouple Sense-Plan-Act autonomous mobile robot applications can be implemented using ROS2 topic registration.
This diagram shows the software components included in the EI for AMR package. The software stack will keep evolving iteratively with additional algorithms, applications, and third-party ecosystem software components. For ease of use, the complete software stack is provided in a single container called the Full Flavour container. This container is constructed hierarchically by extending the OpenVINO container, which itself extends the ROS2 container. For storage space savings, you can choose to run just the ROS2 container or the OpenVINO container, depending on the needs of your applications.
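The hierarchical layering can be expressed in Dockerfile terms roughly as follows. The image names and tags below are hypothetical placeholders for illustration, not the actual tags shipped in the package:

```dockerfile
# Hypothetical sketch only; actual EI for AMR image names/tags differ.
# The OpenVINO container is built by extending the ROS2 container:
FROM amr-ros2:2021.2
# ...install the OpenVINO toolkit, DL Streamer plugins, Wandering app...

# The Full Flavour container in turn extends the OpenVINO container:
# FROM amr-openvino:2021.2
# ...install Intel oneAPI Base Toolkit, DPC++ compatibility tool, profilers...
```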
  • The ROS2 container includes the ROS2 middleware and tools, Intel® RealSense™ SDK and ROS2 wrapper, GStreamer* and build tools, ROS2 packages (Cartographer, Navigation, RTAB_MAP), and the Fast Mapping application (the Intel-optimized version of OctoMap).
  • The OpenVINO container includes the ROS2 container, as well as the Intel® Distribution of OpenVINO™ development toolkit, the DL Streamer GStreamer plugins, and the Wandering demonstration application.
  • The AMR Full Flavour container includes the OpenVINO container, as well as the Intel® oneAPI Base Toolkit, the DPC++ Compatibility Tool, and profiler and analyzer tools.
The EI for AMR software stack is based on software supported by, and part of, the underlying hardware platforms, their respective UEFI-based boot, and their supported Linux* operating systems. For details, see Requirements.


Edge Insights for Autonomous Mobile Robots relies on standard Intel® Architecture Linux drivers upstreamed in the Linux kernel and included in Ubuntu* distributions. These drivers are not included in the EI for AMR package. Examples include Wi-Fi, Ethernet, integrated GPU, USB, PCIe, and others.


EI for AMR integrates the following middleware packages:
  • Robot Operating System (ROS) software stack V2 (ROS2) from the open source project, able to execute sample ROS2 tutorial applications.
    • The Robot Operating System (ROS) is a set of open source software libraries and tools for building robot applications.
    • ROS2 depends on other middleware, like the Object Management Group (OMG) Data Distribution Service (DDS) connectivity framework that uses a publish–subscribe pattern.
  • Intel® Distribution of OpenVINO™ toolkit 2021.2, supporting heterogeneous execution across Intel® CPUs, integrated graphics, Intel® Neural Compute Stick 2 (Intel® NCS2), and Intel® Vision Accelerator Design with Intel® Movidius™ Myriad™ VPUs.
  • Intel® oneAPI Base Toolkit, able to execute oneAPI base toolkit sample applications. The Intel® oneAPI Base Toolkit is a core set of tools and libraries for developing high-performance, data-centric applications across diverse architectures. It features an industry-leading C++ compiler and the Data Parallel C++ (DPC++) language, an evolution of C++ for heterogeneous computing.
    For information about oneAPI training and a presentation of the CUDA converter, refer to:
  • Intel® RealSense™ SDK and the Intel® RealSense™ ROS2 wrapper, able to execute Intel® RealSense™ ROS2 wrapper sample applications.
  • GStreamer* including support for libv4l2 video sources and GStreamer plugins.


Edge Insights for Autonomous Mobile Robots includes reference algorithms as well as deep learning models as working examples for the following automated robot control functional areas:
  • ROS2 Cartographer 2D LIDAR SLAM. Cartographer is a system that provides real-time simultaneous localization and mapping (SLAM), here based on real-time 2D LIDAR sensor data. It is used to generate as-built floor plans or maps.
  • ROS2 RealSense VSLAM. Visual Simultaneous Localization and Mapping (vSLAM) can map the location and create a 3D virtual map, based on computer vision.
  • ROS2 RealSense FastMapping Algorithm. An additional algorithm that creates a map of a robot's surroundings, based on Intel® RealSense™ video and IMU sensor data.
  • ROS2 Navigation. The ROS2 Nav2 navigation stack seeks to find a safe way to have a mobile robot move from point A to point B. It performs dynamic path planning, computes velocities for motors, detects and avoids obstacles, and structures recovery behaviors.
  • OpenVINO™ Model Zoo. Optimized deep learning models and a set of demos to expedite development of high-performance deep learning inference applications. You can use these pre-trained models instead of training your own models to speed up the development and production deployment process.
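The path-planning task the Nav2 stack performs can be illustrated with a minimal grid search. This is plain-Python breadth-first search for illustration only, not the actual Nav2 planner (which operates on costmaps via planner plugins); the function name and grid are hypothetical:

```python
# Conceptual sketch of grid-based path planning: find a safe route from
# point A to point B around obstacle cells. Not the Nav2 implementation.
from collections import deque


def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, avoiding
    obstacle cells (value 1), or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}  # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to start
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable


# 0 = free, 1 = obstacle: the planner routes around the wall in the middle.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
print(path)  # → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

In a real robot, the grid would be a costmap updated from sensor data, and the planner would re-run as obstacles appear, which is the "dynamic" part of dynamic path planning.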


  • Fast Mapping. An algorithm on top of ROS (Robot Operating System) that generates and maintains a volumetric representation of the environment, given the depth image input of an Intel® RealSense™ camera (e.g., D435i) and the position from a localization algorithm, such as GSlam, RTABMap, or vSLAM.
  • Wandering AI Application. The RealSense AI Wandering ROS2 sample application demonstrates the combination of the middleware, algorithms, and the ROS2 navigation stack to move a robot around a room while avoiding obstacles, updating a local map in real time (exposed as a ROS topic), and publishing AI-based object detections on another ROS topic. It uses the robot's sensors and actuators that are available from the robot's hardware configuration.
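The volumetric-mapping idea behind Fast Mapping can be sketched as follows. This is plain Python for illustration; the real algorithm is an Intel-optimized OctoMap variant with a far more efficient representation, and the class, voxel size, and poses below are hypothetical:

```python
# Conceptual sketch of volumetric mapping: depth points are discretized
# into voxels whose occupancy counts are updated incrementally per frame.
from collections import defaultdict

VOXEL_SIZE = 0.1  # meters per voxel edge (illustrative value)


def voxel_of(point):
    """Map a 3D point (meters) to its integer voxel coordinates."""
    return tuple(int(coordinate // VOXEL_SIZE) for coordinate in point)


class VoxelMap:
    def __init__(self):
        self.hits = defaultdict(int)  # observation count per voxel

    def integrate(self, depth_points, robot_pose=(0.0, 0.0, 0.0)):
        """Fold one depth frame into the map (pose as a translation only;
        a real system would apply the full pose from the localizer)."""
        for x, y, z in depth_points:
            world = (x + robot_pose[0], y + robot_pose[1], z + robot_pose[2])
            self.hits[voxel_of(world)] += 1

    def occupied(self, min_hits=1):
        return {v for v, n in self.hits.items() if n >= min_hits}


vmap = VoxelMap()
vmap.integrate([(0.05, 0.05, 1.02), (0.06, 0.04, 1.04)])  # same voxel
print(sorted(vmap.occupied()))  # → [(0, 0, 10)]
```

Requiring a minimum hit count before marking a voxel occupied is a simple way to suppress single-frame depth noise; real implementations use probabilistic occupancy updates instead.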


ROS Tools
Edge Insights for Autonomous Mobile Robots includes the following ROS tools:
  • rviz2 to visualize ROS topics.
  • ROS rqt is a software framework of ROS that implements the various GUI tools in the form of plugins.
  • colcon (collective construction) is a command line tool to improve the workflow of building, testing, and using multiple software packages. It automates the process, handles the ordering, and sets up the environment to use the packages.
Edge Insights for Autonomous Mobile Robots includes simulation support:
  • The Gazebo* robot simulator makes it possible to rapidly test algorithms, design robots, perform regression testing, and train AI systems using realistic scenarios. Gazebo offers the ability to simulate populations of robots accurately and efficiently in complex indoor and outdoor environments. A robust physics engine, high-quality graphics, and convenient programmatic and graphical interfaces are available at your fingertips.


The following terminology is used in this document.
  • CPU: Central Processing Unit
  • EI for AMR: Edge Insights for Autonomous Mobile Robots
  • Fast Mapping
  • GPU: Graphics Processor Unit
  • IE: Inference Engine
  • LIDAR: Light Detection and Ranging
  • NN: Neural Network
  • ROS: Robot Operating System
  • SDK: Software Development Kit
  • SLAM: Simultaneous Localization And Mapping
  • vSLAM: Visual Simultaneous Localization and Mapping

Product and Performance Information


Performance varies by use, configuration and other factors. Learn more at