I successfully converted my models to FP32 IR and was able to run inference without any issues. I also converted the same models to FP16 IR, but when I try to load these FP16 models with the CPU option I get this error:
Error loading model into plugin: The plugin does not support models of FP16 data type.
When I select the GPU option instead, I get this message:
Error loading model into plugin: [NOT_IMPLEMENTED] Input image format FP16 is not supported yet...
I'm running this on Windows 10. Do I need to change anything else to use FP16 IR models?