Media

Memory usage during H.264 decode...

My application needs to stream multiple H.264 streams of various resolutions. When decoding several 5-megapixel H.264 streams, I can decode only a few of them with hardware decoding. With software decoding the number of streams is several times higher, but still 50% fewer than when using IPP 7.1 for H.264 decode. I'm using low-latency mode in all of these cases. My application is 32-bit for now, and the system has 4 GB of memory.

Is there a way to reduce the memory footprint of the H.264 decoder per stream?

 

simple_encode_d3d - frame to stream instead of file

I am looking at the simple_encode_d3d.cpp example and trying to write something similar that takes an individual frame at a time instead of reading from a file, then writes an H.264 bytestream out to a socket, and I had a few questions.

First of all, what is the difference between the Stage 1 encoding loop and Stage 2, which says it retrieves buffered encoded frames? From a code perspective, they appear to do exactly the same thing, and I'm not sure why it needs to be done twice. How different would this process be if I'm encoding one frame at a time?

ffmpeg sample code Linux Server Edition

I have made small modifications to the ffmpeg decode sample that ships with the Windows edition in order to compile it on the Linux edition. I have it compiling but cannot get the libavformat library to link. I am not terribly familiar with CMake, so if someone could point out how to link libavformat, that would be great.
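Not an official fix, but a minimal CMake sketch of one common way to link FFmpeg via pkg-config. The target name `sample_decode` is an assumption — substitute the sample's actual `add_executable` target. Note that libavformat depends on libavcodec and libavutil, so all three are requested:

```cmake
find_package(PkgConfig REQUIRED)
pkg_check_modules(FFMPEG REQUIRED libavformat libavcodec libavutil)
include_directories(${FFMPEG_INCLUDE_DIRS})
link_directories(${FFMPEG_LIBRARY_DIRS})
# Assumed target name -- use the name from the sample's add_executable().
target_link_libraries(sample_decode ${FFMPEG_LIBRARIES})
```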

My error:

MFXVideoEncode Init sometimes returns an error

Hi, 
Sometimes I get weird behavior when I call MFXVideoEncode.Init(): it returns mfxStatus = -17 (device operation failure).
I get it for various frame sizes in an inconsistent manner (i.e., usually everything is OK and Init succeeds, but sometimes, with the same parameters, I get the above error).
When I get the error, I usually call Init() again with the same parameters until it returns MFX_ERR_NONE, which suggests it is not a parameter problem.
Do you have any idea what the reason might be? What should I look for?

System details:

Why does sample_encode.exe crash when I turn on hardware mode or use a custom YUV file?

I'm running the sample_encode.exe sample that came with the SDK and for some reason it crashes when I turn on hardware mode.  Any idea why?

Regarding my hardware: the DirectX Diagnostic Tool shows an Intel Xeon CPU E5-1607 running at 3 GHz (quad core), and my graphics card is an NVIDIA Quadro 600.

Macroblock artefacts when encoding H.264 stream

Hi,

I'm using the Media SDK to encode an H.264 stream, and when I play the encoded file I see unusual macroblock artefacts in certain frames.

This only happens when I encode with a large number of B-frames compared to the GOP size.

I've attached a GOP exhibiting the problem (starting from frame 11 in stream order) as well as the logs from the Media SDK tracer.

Thanks!

Actual bitrate is roughly half of desired bitrate when using MFX_RATECONTROL_CBR

I am having trouble getting a bitrate close to my desired bitrate. I am using constant bitrate control (MFX_RATECONTROL_CBR) at 700 kbps with a frame rate of 30 fps. I track the encoded frame size via the mfxBitstream DataLength field, multiply by 8 to get bits, and divide by the number of seconds of encoding to calculate the bitrate. I consistently calculate a bitrate of about half of TargetKbps. Does anything look out of place in my encoder parameters? Thanks!

Linux media sdk as root

I am working to integrate the Media SDK into a streaming video application that runs under Linux. Does the SDK have to be initialized in an application run as root? When I try to use it as a non-root user, the vaInitialize call in the DRMLibVA constructor fails. I have ruled out environment variables by setting any missing ones for the non-root user.

thanks,

bfp
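One thing worth checking (a guess, not a confirmed diagnosis): vaInitialize opens a DRM device node under /dev/dri, and on many distributions those nodes are accessible only to root and the video group, which would fail for a non-root user in exactly this way. A quick non-destructive check:

```shell
# See which user and group may open the DRM nodes libva goes through
ls -l /dev/dri
# See whether the current user is in that group (often "video")
id -nG
# If not, the usual fix is: sudo usermod -a -G video <user>, then re-login.
```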

Linux Media SDK Downscaling Issue

I have set up a transcode chain using the Intel Linux Media SDK, where 1080i content is decoded, deinterlaced, and resized to 720x480, then saved to a YUV file for analysis.

I notice that the scaling algorithm used by the Media SDK seems to add a lot of aliasing artifacts in the transition from 1920x1080 -> 720x480 or from 1280x720 -> 720x480. These are very clear on diagonals or rounded edges, which show stair-stepping artifacts.
