QuickSync for Live Transcoding

Dear all,

I have got a question:

Is it possible to use Intel Media SDK 2012 or 2013 for live transcoding? (I have used it for file transcoding, but I don't know whether I can use it for live streams as well.)

I appreciate any help.

Hi Amir,

Intel Media SDK can definitely be used for that use case. A key requirement is low latency, and the SDK supports low-latency operation along with other encode/decode features aimed at video streaming and video conferencing. For an introduction, please refer to the following white paper:
http://software.intel.com/en-us/articles/video-conferencing-features-of-...  

The Media SDK samples sample_decode and sample_videoconf also illustrate how to use these features.

Regards,
Petter 

Dear Petter,

Thanks for your reply. Yes, you are right, but sample_videoconf takes YUV as input from a camera or another live raw source.

What I am looking for exactly is live transcoding, for instance transcoding an MPEG-2 input stream into an H.264 output stream.

sample_multi_transcode performs this function, but on files rather than live streams. So, what do you think in this case? Any clue?

Thanks.

Hi Amir,

Intel Media SDK by itself does not provide a complete end-to-end solution for the use case you are describing.

The SDK sample "sample_multi_transcode" can be configured in the same way as described in the samples related to video conferencing to achieve low-latency transcode. However, such a setup will only transcode from one elementary video stream to another. Audio codecs and muxing/demuxing are outside the scope of Intel Media SDK. If you are looking to implement a streaming solution, you will likely need to integrate such features and additional network streaming components.

That said, Intel Media SDK can quite easily be integrated with the FFmpeg framework, which has most of what is needed to realize a network streaming solution.

For details on how to implement a rudimentary FFmpeg integration with Media SDK, please refer to the following white paper:
http://software.intel.com/en-us/articles/integrating-intel-media-sdk-wit...

We are planning to publish a more complete FFmpeg integration example externally within the next few weeks. If this is of interest to you, please keep an eye on the Media SDK front page on Intel VC Source: http://software.intel.com/en-us/vcsource/tools/media-sdk

Regards,
Petter 

Dear Petter,

Thanks for your great reply.

I have downloaded Media SDK 2013 and have already familiarized myself with the new samples that include the FFmpeg integration feature.

My problem is not related to mux/demux; I can manage that with FFmpeg or other tools. Let me clarify what I need below:

Simply put: suppose I receive live streams via a DVB-T receiver and can play the stream using VLC. I can then demux the video and audio into two separate streams. Now I need to be able to transcode this live video stream, which is coded as MPEG-2 or H.264, and generate live output in either MPEG-2 or H.264 format.

The video conferencing sample of Media SDK works in such a way that it receives YUV from a camera source, for instance, and encodes the raw YUV input; it does not process a compressed stream. (Please let me know if I am wrong.)

From the programming point of view, to be able to perform the live transcoding I need to create a FIFO pipe (Linux-like) mechanism on the Windows operating system. Now, suppose I did this: can I use the Media SDK functionality to process the bitstream coming from the FIFO pipe?

My other question is: is there a Media SDK for Linux, like what is available for Windows?

Finally, I appreciate your patience with my asking several questions.

Regards,

Hi Amir,

From your description I believe you can achieve what you want by modifying the Media SDK sample_multi_transcode sample.
By following the same techniques described in the video conferencing white paper, you can effectively produce a synchronous pipeline, meaning one frame in / one frame out.

The following parameter settings are central:
AsyncDepth = 1
GopRefDist = 1
MaxDecFrameBuffering = 1 
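
As a rough sketch, the three settings map onto the SDK structures as shown below. Note that the struct definitions here are simplified stand-ins showing only the relevant fields; the real `mfxVideoParam` and `mfxExtCodingOption` definitions (and the extended-buffer plumbing that attaches `mfxExtCodingOption` to `mfxVideoParam`) come from the SDK's mfxvideo.h.

```cpp
#include <cstdint>

// Simplified stand-ins for the Media SDK structs: only the fields
// used here are shown. The real definitions come from mfxvideo.h.
struct mfxInfoMFX { uint16_t GopRefDist; };
struct mfxVideoParam {
    uint16_t   AsyncDepth;
    mfxInfoMFX mfx;
};
struct mfxExtCodingOption { uint16_t MaxDecFrameBuffering; };

// Configure a pipeline for one-frame-in / one-frame-out latency.
void ConfigureLowLatency(mfxVideoParam& par, mfxExtCodingOption& co) {
    par.AsyncDepth          = 1;  // synchronous pipeline: no queued operations
    par.mfx.GopRefDist      = 1;  // no B-frames, so no reordering delay
    co.MaxDecFrameBuffering = 1;  // decoder releases each frame immediately
}
```

In real code, `ConfigureLowLatency` would be called on the parameters before session initialization, once per pipeline stage that supports the setting.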

However, note that the transcode sample reads several frames from the bitstream buffer if available. Assuming you have a demuxer hooked up, you can instead feed the pipeline one frame at a time, as long as you set the MFX_BITSTREAM_COMPLETE_FRAME flag as described in the paper and in the sample_decode sample.
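
To illustrate the frame-at-a-time feeding described above: each demuxed access unit is wrapped in a bitstream structure with the complete-frame flag set before it is handed to the decoder. The struct and constant below are simplified stand-ins for the real `mfxBitstream` and `MFX_BITSTREAM_COMPLETE_FRAME` from the SDK headers (do not rely on the placeholder value used here), so treat this as a sketch of the idea rather than the actual API surface.

```cpp
#include <cstdint>

// Stand-ins for the SDK definitions in mfxstructures.h; the constant's
// value is a placeholder, so always use the header's definition.
const uint16_t MFX_BITSTREAM_COMPLETE_FRAME = 0x0001;
struct mfxBitstream {
    uint8_t* Data;
    uint32_t DataOffset;
    uint32_t DataLength;
    uint16_t DataFlag;
};

// Wrap one demuxed frame for the decoder. Marking the buffer as a
// complete frame lets the decoder start work immediately instead of
// buffering input while it searches for the next frame boundary.
mfxBitstream WrapCompleteFrame(uint8_t* frame, uint32_t size) {
    mfxBitstream bs{};
    bs.Data       = frame;
    bs.DataOffset = 0;
    bs.DataLength = size;
    bs.DataFlag   = MFX_BITSTREAM_COMPLETE_FRAME;
    return bs;
}
```

In a live setup, the demuxer callback would call something like `WrapCompleteFrame` for each video access unit and pass the result to the decode stage, giving the one-frame-in / one-frame-out behavior.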

Intel Media SDK is currently not available for Linux. We will share more on this topic with the community soon.

Regards,
Petter

Dear Petter,

Thank you very much for your solution.

Kind Regards,

Amir
