HD 4400 and DXVA_ModeHEVC_VLD_Main performance

I've successfully implemented an HEVC DirectShow decoder filter using your DXVA2 DXVA_ModeHEVC_VLD_Main profile. It's really fantastic to see you taking the lead among video card vendors in supporting the new DXVA2 bits :)) Kudos and congratulations to your team for an excellent job. For the most part, HEVC VLD is working just fine.

While a 4K test stream seems to decode well, I'm finding that igdumdim32.dll causes the application to become CPU-bound on the current drivers (5/17/2014, 10.18.10.3621). igdumdim32 accounts for about 95% of the application's CPU work during CVideoDecodeDevice::Execute(), so we get smooth video for a few seconds and then jitter as the bottleneck takes its toll. It's interesting to see such elevated CPU usage from the igdumdim32.dll DXVA2 implementation. The player application uses about 30% system time overall (still a heck of a lot better than the ~1 fps that pure software/reference code achieves while pegging the i7 at 99%!).

Is this experience more or less the expected behaviour for an HD 4400 chip driving an Iris display? I'm just trying to get an idea of how feasible 4K video is on current-gen hardware: is 4K still an unrealistic goal, or do you plan to move more of the work onto the GPU in a future driver release?

Thanks again for the great effort. It's always a pleasure doing DXVA work on Intel gear.

- Simon
