How can we measure GPU usage (utilization %, memory/RAM usage) while inference is running on the GPU?
intel_gpu_tools doesn't seem to provide this information.
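For context, a minimal sketch of what I have tried so far. This assumes the `intel_gpu_top` tool from the `intel-gpu-tools` package and the i915 debugfs node `/sys/kernel/debug/dri/0/i915_gem_objects` (path and availability vary by kernel and driver; debugfs typically requires root):

```shell
# Per-engine busyness (not memory) as periodic JSON samples:
#   sudo intel_gpu_top -J -s 1000
# Rough view of GPU memory allocations via i915 debugfs, if exposed:
if [ -r /sys/kernel/debug/dri/0/i915_gem_objects ]; then
    cat /sys/kernel/debug/dri/0/i915_gem_objects
else
    echo "i915 debugfs not available on this system"
fi
```

This only shows driver-level allocation totals, not a clean per-process "% memory used" figure, which is why I am asking here.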
Thanks and Regards,
Dear Shetty, Harsha,
This is not an OpenVINO issue; this forum is dedicated to Model Optimizer and Inference Engine support. From that perspective, measuring memory/RAM usage while inference runs on a GPU is no different from measuring it while using an Intel GPU for gaming. Kindly post your question to the forum below: