The GPU Duration metric displays the total GPU time for the frame, or for the selected ergs within that frame.
The graphics driver 9.17.10 introduces the notion of deferred clears. As an optimization, the driver may defer the actual rendering of a clear call when subsequent clear and draw calls could make it unnecessary. As a result, when clear calls are deferred, the Graphics Frame Analyzer shows their GPU Duration and Samples Written as zero. If the driver later determines that a deferred clear call must actually be drawn, the work associated with that clear call is included in the duration of the erg being processed at that moment, which is not necessarily the clear call itself. This means that in the Graphics Frame Analyzer, the metrics associated with a clear call may not accurately reflect the real work performed for that erg.
For example, if GPU Duration is 80,000 (reported in microseconds), the GPU spends around 80 milliseconds rendering the selected ergs.
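This conversion can be sketched as a small helper. A minimal example, assuming the raw value is reported in microseconds and using a hypothetical frame-budget check (the function names are illustrative, not part of the Graphics Frame Analyzer):

```python
def gpu_duration_ms(raw_us: float) -> float:
    """Convert a raw GPU Duration value (assumed microseconds) to milliseconds."""
    return raw_us / 1000.0

def within_budget(raw_us: float, target_fps: float = 60.0) -> bool:
    """Check whether the measured GPU time fits the per-frame budget."""
    budget_ms = 1000.0 / target_fps  # e.g. ~16.7 ms at 60 FPS
    return gpu_duration_ms(raw_us) <= budget_ms

print(gpu_duration_ms(80_000))  # 80.0 ms
print(within_budget(80_000))    # False: 80 ms exceeds a 16.7 ms budget
```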
When using GPU Duration as a metric to help understand the performance of your game or application, it is important to understand the following:
- Is the value too large? If so, examine the underlying components of the rendering pipeline to see whether one or more of them is too complex and therefore a potential performance bottleneck.
Check the Pixel Shader Duration, Vertex Shader Duration, and Geometry Shader Duration metrics.
- How effectively is the GPU working for the selected ergs?
Check the GPU EUs Active and GPU EUs Stalled metrics.
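The checklist above can be sketched as a simple triage routine. This is a hypothetical illustration: the dictionary layout and thresholds are assumptions for the example, not a Graphics Frame Analyzer API; only the metric names come from this section.

```python
def triage(metrics: dict) -> list:
    """Suggest where to look first, given per-erg metric values."""
    hints = []
    total = metrics.get("GPU Duration", 1)
    # A shader stage that dominates GPU Duration hints at an overly
    # complex pipeline stage (threshold of 50% is an arbitrary example).
    for stage in ("Pixel Shader Duration", "Vertex Shader Duration",
                  "Geometry Shader Duration"):
        if metrics.get(stage, 0) > 0.5 * total:
            hints.append(f"{stage} dominates; simplify this stage")
    # EUs stalling more than they execute suggests the GPU is waiting
    # rather than doing useful work for the selected ergs.
    if metrics.get("GPU EUs Stalled", 0) > metrics.get("GPU EUs Active", 0):
        hints.append("EUs are stalled more than active; investigate why")
    return hints

example = {
    "GPU Duration": 100,
    "Pixel Shader Duration": 80,
    "GPU EUs Active": 40,
    "GPU EUs Stalled": 60,
}
print(triage(example))
```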