NV12 Interlaced Field Pairs in Memory

Hi there,

I would like to know how the H.264 encoder stores NV12 interlaced data (a field pair) in memory.

Are the two fields stored one after the other, or interleaved with each other?

Any advice appreciated.

Joyah.

- "What hurts more, the pain of hard work, or the pain of regret?"

Hi Joyah,

The native color format used by Media SDK is the NV12 pixel format, as described on the FourCC webpage: http://www.fourcc.org/yuv.php#NV12
In essence it is a Y plane followed by an interleaved U/V plane.

Note that the Media SDK sample code includes file access utility functions that read and write raw YV12 files. YV12 is used as the storage format because it is a more common format for storing raw frame data (http://www.fourcc.org/yuv.php#YV12), with separate, non-interleaved U and V planes. The file access utility functions therefore perform the NV12-to-YV12 conversion (or the reverse).
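The de-interleaving that such a conversion involves can be sketched like this (a minimal plain-C illustration, not the actual sample code; it assumes tightly packed buffers, whereas real surfaces also have a pitch/stride to respect):

```c
#include <stddef.h>

/* Illustrative sketch: convert a tightly packed NV12 frame to YV12.
 * NV12: Y plane, then one interleaved UV plane (U0 V0 U1 V1 ...).
 * YV12: Y plane, then a full V plane, then a full U plane. */
static void nv12_to_yv12(const unsigned char *nv12, unsigned char *yv12,
                         size_t w, size_t h) {
    size_t i, luma = w * h, chroma = luma / 4;
    const unsigned char *uv = nv12 + luma;   /* interleaved UV plane */
    unsigned char *v = yv12 + luma;          /* YV12 stores V first  */
    unsigned char *u = v + chroma;           /* then U               */

    for (i = 0; i < luma; i++)               /* Y copies unchanged   */
        yv12[i] = nv12[i];
    for (i = 0; i < chroma; i++) {           /* de-interleave U/V    */
        u[i] = uv[2 * i];
        v[i] = uv[2 * i + 1];
    }
}
```

The reverse direction (YV12 to NV12) is the same loop with the reads and writes swapped.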

Regards,
Petter

I hope this is helpful. In both:
"C:\Program Files\Intel\Media SDK 2012 R3\doc\mediasdk-man.pdf"
"C:\Program Files\Intel\Media SDK 2012 R3\doc\Intel_Media_Developers_Guide.pdf"

search for 'interlace' and read about how to set up the various structs for frame input.

Most importantly, you will see that mfxExtCodingOption contains flags controlling whether interlaced video is created.
You will also see that the PicStruct enum contains options for indicating which scan lines belong to which field when encoding.

Expect to experiment and spend some time making sure you get the desired bitstream when changing these parameters.

Good luck
Cameron


Hi Petter,
I found this sentence in mediasdk-man.pdf on page 20:
"Note: NV12 is the only supported native encoding and decoding format."
It confuses me: between YV12 and NV12, which is the preferred format?
Thanks
Joyah

- "What hurts more, the pain of hard work, or the pain of regret?"


Hi Cameron,
I'll look through these two PDFs for more information, thank you so much!
Joyah

- "What hurts more, the pain of hard work, or the pain of regret?"
