Edge Inference

Develop your computer vision applications using the Intel® DevCloud, which includes a preinstalled and preconfigured version of the Intel® Distribution of OpenVINO™ toolkit. Access reference implementations and pretrained models to help explore real-world workloads and hardware acceleration solutions.
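As an illustration only (a sketch, not an official DevCloud sample), the snippet below loads a pretrained model that has already been converted to the toolkit's Intermediate Representation (IR) format and runs a single inference with the Inference Engine Python API. The model file names and the dummy input are placeholder assumptions, not DevCloud-specific paths.

    # Minimal sketch: one inference with the OpenVINO Inference Engine Python API.
    # Assumes a model already converted to IR ("model.xml"/"model.bin" are placeholders).
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    exec_net = ie.load_network(network=net, device_name="CPU")

    input_name = next(iter(net.input_info))
    output_name = next(iter(net.outputs))

    # Dummy NCHW input sized to the network's expected shape (stand-in for a real image).
    n, c, h, w = net.input_info[input_name].input_data.shape
    frame = np.random.rand(n, c, h, w).astype(np.float32)

    result = exec_net.infer(inputs={input_name: frame})
    print(result[output_name].shape)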

Sign Up for Beta

What You Can Do

  • Deep learning for computer vision
  • Traditional computer vision
  • Hardware acceleration
  • Inference performance comparisons
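The hardware acceleration and inference performance comparisons listed above come down to loading the same network onto different targets and measuring throughput. The loop below is a rough, hand-rolled timing sketch under the same placeholder assumptions as the earlier example; which accelerators respond depends on the node you select, and the toolkit's benchmark_app is the more rigorous way to measure.

    # Rough timing sketch: load the same IR model on several devices and compare throughput.
    # Model paths are placeholders; device availability depends on the chosen node.
    import time
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    input_name = next(iter(net.input_info))
    n, c, h, w = net.input_info[input_name].input_data.shape
    frame = np.random.rand(n, c, h, w).astype(np.float32)

    for device in ("CPU", "GPU", "MYRIAD"):
        try:
            exec_net = ie.load_network(network=net, device_name=device)
        except RuntimeError:
            continue  # accelerator not present on this node
        runs = 100
        start = time.perf_counter()
        for _ in range(runs):
            exec_net.infer(inputs={input_name: frame})
        elapsed = time.perf_counter() - start
        print(f"{device}: {runs / elapsed:.1f} frames per second")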

What's Inside

Hardware

  • 6th to 8th generation Intel® Core™ processors
  • Intel® Xeon® processors
  • Intel® IoT Developer Kits
  • Intel® Neural Compute Stick
  • Intel® Processor Graphics
  • Image Processing Units
  • Intel® FPGAs
  • Intel® Vision Accelerator Design
  • Intel® Movidius™ Vision Processing Unit

Frameworks

  • TensorFlow*
  • Intel® Optimization for Caffe*
  • ONNX*
  • Apache MXNet*

Topologies

  • Tiny YOLO* version 3
  • Full DeepLab version 3
  • Bidirectional long short-term memory (LSTM)

Tools

  • Intel® Distribution of OpenVINO™ toolkit

Optimized API Calls

  • OpenCV*
  • OpenCL™
  • OpenVX*
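As one illustration of the OpenCV* entry above (a sketch, not an official sample): OpenCV builds that ship with the toolkit can route the dnn module through the Inference Engine backend, so existing OpenCV code picks up the optimized path with two extra calls. The model paths and input size below are placeholder assumptions.

    # Sketch: run an IR model through OpenCV's dnn module with the Inference Engine backend.
    # "model.xml"/"model.bin" and the 224x224 input size are placeholders.
    import cv2
    import numpy as np

    net = cv2.dnn.readNet("model.xml", "model.bin")
    net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
    net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)

    image = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
    blob = cv2.dnn.blobFromImage(image, size=(224, 224), swapRB=True)
    net.setInput(blob)
    output = net.forward()
    print(output.shape)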

What You Get

  • Latest Intel® hardware
  • Intel® optimized frameworks
  • Latest computer vision tools
  • 50 GB of file storage

Support

Our team monitors the community forum Monday through Friday, 9:00 a.m. to 5:00 p.m. Pacific Daylight Time.

Get Help