The H.264 video decoder (UMC::H264VideoDecoder) has an annoying problem in the way it handles the timestamps of the data it decodes, and in how that affects the timestamps of the frames it produces.

When compressed video data is sent to the decoder (the UMC::MediaData buffer passed as the first argument to H264VideoDecoder::GetFrame()), any timestamp assigned to that data (via UMC::MediaData::SetTime) is completely ignored when the input is parsed into NAL units (see the method TaskSupplier::AddOneFrame, in which the source data is split into NAL units).

As a consequence, the decoded frames carry timestamps that are set arbitrarily: not based on the timestamp of the input data, but on the "framerate" configured for the video decoder.

When the source data is an MP4 file, there is no real concept of a "framerate". Instead, each frame has its own timestamps: a decoding timestamp (DTS) and a presentation timestamp (PTS). One could estimate an overall framerate for the video by dividing the number of frames by the total duration, but that is a hack, and it will lead to potential drift in the output timestamps of the decoded frames.

So the question is: is there a way to fix this? How can one set a timestamp on the source data sent to the decoder, and have that timestamp be reflected in the decoded frames?