What is the most simple way to decode as RGB32?

I'm decoding from system memory buffers into system memory buffers, using hardware acceleration. Unfortunately, this gives me NV12 buffers that I then have to convert on the CPU to RGB32 for presentation, which is very slow. Is it possible to have the conversion performed by the hardware, and what would be the simplest way to do that?

I have checked the reference manual, and under Hardware Acceleration there is a section explaining that under D3D9 and D3D11 the decoder can have a "Decoder Render Target" of type RGB32. However, I don't understand what that entails or how to set it up. Does it require using D3D surfaces as output? Where do you specify that you want RGB32 output? Is there still a way to automatically convert back to system memory buffers?



The Media SDK VPP module can convert NV12 to RGB32.

Best Reply

Yes, using opaque memory (out of decode and into VPP) is a good way to decode to RGB32 in system memory.

Please let me know if you have any issues with this.
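A minimal sketch of that setup, assuming the standard Media SDK C API (`session` is a placeholder for an already-initialized `mfxSession` shared with the decoder, and the opaque-surface-allocation extended buffer that must be shared between decoder and VPP is noted but omitted for brevity):

```cpp
#include <mfxvideo.h>
#include <cstring>

// Sketch: configure VPP to convert the decoder's NV12 output to RGB32
// (MFX_FOURCC_RGB4 in Media SDK terms), taking opaque surfaces in and
// writing plain system memory out. Error handling is omitted.
mfxStatus InitNv12ToRgb32Vpp(mfxSession session, const mfxFrameInfo& decOut)
{
    mfxVideoParam par;
    memset(&par, 0, sizeof(par));

    // Input side: whatever the decoder produces (NV12).
    par.vpp.In = decOut;
    par.vpp.In.FourCC       = MFX_FOURCC_NV12;
    par.vpp.In.ChromaFormat = MFX_CHROMAFORMAT_YUV420;

    // Output side: same geometry and frame rate, but RGB32.
    par.vpp.Out = par.vpp.In;
    par.vpp.Out.FourCC       = MFX_FOURCC_RGB4;       // RGB32
    par.vpp.Out.ChromaFormat = MFX_CHROMAFORMAT_YUV444;

    // Take the decoder's opaque surfaces in; deliver system memory out.
    // Note: with opaque input, an mfxExtOpaqueSurfaceAlloc buffer shared
    // with the decoder must also be attached to par.ExtParam (not shown).
    par.IOPattern = MFX_IOPATTERN_IN_OPAQUE_MEMORY |
                    MFX_IOPATTERN_OUT_SYSTEM_MEMORY;

    return MFXVideoVPP_Init(session, &par);
}
```

Per frame, you would then pass each decoded surface through `MFXVideoVPP_RunFrameVPPAsync` and wait on the returned sync point with `MFXVideoCORE_SyncOperation`, after which the output surface's system-memory data contains the RGB32 pixels.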


I am experiencing this very same issue. Does anyone from Intel have a specific answer to this that is actually helpful?
