Muxing with Intel® Media Software Development Kit

Download Article and Source Code

Download Muxing With Intel Media SDK (PDF 514KB)
Download Muxing Source Code (Note: Licensing terms match Media SDK 2.0)



The Intel® Media Software Development Kit (Intel® Media SDK) is a software development library that exposes the media acceleration capabilities of Intel platforms for video decoding, video encoding, and video pre/post processing. Intel® Media SDK helps developers rapidly develop software that accesses hardware acceleration for video codecs, with automatic fallback to software if hardware acceleration is not available.

Intel® Media SDK is available free of charge; both a Gold release targeting the current generation of platforms and a Beta release targeting features for next-generation platforms are available for download.

Intel® Media SDK is delivered with a wide range of application samples that illustrate how to use the SDK to encode, decode and transcode media to/from elementary video streams. Aside from the binary DirectShow* filters supplied with Intel® Media SDK, the SDK does not provide the capability of muxing encoded H.264 or MPEG-2 frames into common media containers.

This article presents a method that can be used to perform muxing (into mp4 and mpg containers) of encoded media generated by Intel® Media SDK. The article details how to use the audio/video codec components that are part of the Intel® Integrated Performance Primitives (Intel® IPP) samples to create a muxing solution based on the Intel® Media SDK “sample_encode” project.

“Muxing” is the process of combining video and audio frames, with corresponding time stamps and media metadata, into a file that conforms to a container standard and can be played back by common media player tools such as Windows* Media Player or VLC*. While audio encoding and audio muxing are outside the scope of this article, a similar approach could be used to add that functionality.

Building the solution described in this article requires the following components: Intel® Media SDK 2.0 (including the sample code with the “sample_encode” project), Intel® IPP 7.0 (or Intel® Composer XE, which includes Intel® IPP), and the Intel® IPP samples (specifically the “audio-video-codecs” package).

In addition, the developer also needs Microsoft Visual Studio* and Microsoft Windows* SDK. (Microsoft Visual Studio* 2008 and Microsoft Windows* SDK 7.0 were used when building this solution; We suggest developers use the same configuration to minimize changes)

*Note: The same solution could easily be migrated to Intel® Media SDK 3.0 beta samples.





The following figure illustrates the process of encoding and muxing video frames into a media container. Note that audio encoding and muxing are out of the scope of this article. However, a developer could quite easily extend the capability to also handle muxing of encoded audio samples into a container.

Figure 1 - Overview of Encode – Muxing process

The solution detailed in this article was created using Intel® Media SDK 2.0 with the corresponding video encode console sample, “sample_encode”. The solution also depends on the use of Intel® IPP and its accompanying audio/video codec samples.

The following sections describe the components used in the solution in more detail.


Intel® Media SDK

Intel® Media SDK provides hardware-accelerated and software-optimized media libraries for video encode, decode, and processing functionality on Intel platforms. The optimized media libraries are built on top of Microsoft DirectX*, DXVA APIs, and platform graphics drivers. Intel® Media SDK exposes the hardware acceleration features of Intel® Quick Sync Video (Intel® QSV) built into 2nd generation Intel® Core™ processors.

The figure below provides a high level overview of where Intel® Media SDK fits into the software stack.

Figure 2 - Overview of Intel® Media SDK

For extensive details on all the features of Intel® Media SDK please refer to the manual provided with the SDK.


Intel® Integrated Performance Primitives (Intel® IPP)

Intel® Integrated Performance Primitives (Intel® IPP) is an extensive library of multi-core-ready, highly optimized software functions for digital media and data-processing applications. Intel® IPP offers thousands of optimized functions covering frequently-used fundamental algorithms.

The solution described in this article utilizes Intel® IPP 7.0 through the muxing components in the Intel® IPP sample code.

The following figure gives an overview of the capabilities of Intel® IPP.

Figure 3 - Overview of Intel® IPP

For further details on the Intel® IPP licensing terms, please refer to the Intel® IPP product page or the following link: /en-us/articles/intel-integrated-performance-primitives-faq.


Intel® IPP Samples

This article uses the audio/video codec samples provided as a separate download alongside the core Intel® IPP package. Only the “audio-video-codecs” subfolder of the samples is used in the following solution.

Inside the “audio-video-codecs” subfolder there is a “doc” folder containing a document with further details on how to use the range of sample components (including the mp4 and mpeg muxers used in this article).



In this chapter we will detail all the steps required to implement muxing support for encoded Intel® Media SDK MPEG-2 and H.264 streams. This includes instructions on how to make the required changes to the build projects, library/include file dependencies, and source code, and how to build the Intel® IPP sample components.

*Note that muxing could also be performed after a complete Intel® Media SDK encode using a third-party tool such as ffmpeg, but this naturally allows for much less flexibility. For instance, to mux an H.264 elementary stream into an mp4 container the following ffmpeg command could be used: “ffmpeg -f h264 -i stream.264 -vcodec copy out.mp4”


Building Intel® IPP sample muxer components

First we will build the Intel® IPP audio/video codec sample components (these components are all in the “UMC” namespace) that we will access as libraries in our solution.

Make sure you have installed at least version 7.0 of Intel® IPP (or Intel® Composer XE that includes Intel® IPP) and the Intel® IPP samples.

Also please set “IPPROOT” system environment variable to the Intel® IPP root folder in your file system. For instance: IPPROOT = C:\Program Files\Intel\ComposerXE-2011\ipp\

Execute the following steps to build the Intel® IPP sample components required for the muxing capability:

  1. Open Command prompt
    (depending on where you execute from you may have to open in Administrative mode)
  2. Change directory to Intel® IPP environment setup folder
    For instance: C:\Program Files\Intel\ComposerXE-2011\ipp\bin\
  3. Execute the Intel® IPP environment script
    ippvars.bat <arch>
    (where <arch> is “ia32” or “intel64”)
  4. Change directory to Intel® IPP samples audio video codecs folder
    <location of Intel® IPP samples>\audio-video-codecs\
  5. In this folder you will find a Makefile. Modify the Makefile compile flag M_FLAG and set it to “/MTd” (multithreaded debug, statically linked run-time library) instead of the default “/MD”. This is required to match the project settings of the Intel® Media SDK sample_encode project that we base the development on.
  6. Run the build script
    For 32 bit: build_ia32.bat cl9
    For 64 bit: build_intel64.bat cl9
    “cl9” indicates the version of Microsoft Visual Studio*. In the example above “9” indicates version 9 of the tool, which equals Microsoft Visual Studio* 2008. Modify according to your setup (there is more info on this in \audio-video-codecs\ReleaseNotes.htm).
  7. Note that the build is somewhat time-consuming. After compilation make sure all the base libraries compiled ok; don’t worry if the application projects failed to compile. The binary libraries are created in:
    <location of Intel® IPP samples>\audio-video-codecs\_bin\<arch>_cl9\lib\

Set the “IPPSAMPLESROOT” system environment variable to the root folder of the Intel® IPP samples. For instance: IPPSAMPLESROOT = C:\Program Files\Intel\MediaSDK\ipp_samples. The “audio-video-codecs” folder should be located directly under this path.
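If you prefer to set both variables from a command prompt instead of the System Properties dialog, the standard Windows setx command can be used (the paths below are simply the example locations given above; adjust them to your installation):

	setx IPPROOT "C:\Program Files\Intel\ComposerXE-2011\ipp"
	setx IPPSAMPLESROOT "C:\Program Files\Intel\MediaSDK\ipp_samples"

Note that setx only affects processes started afterwards, so restart Microsoft Visual Studio* (or the command prompt) after running these commands.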


Integrating muxer components with encoder

Note that the code below follows the same programming conventions and macros from the existing Intel® Media SDK samples. Please refer to the Intel® Media SDK sample code for further details.

For exact details on the code changes please refer to the provided Microsoft Visual Studio* solution/projects.

Changes to Microsoft Visual Studio* “sample_encode” project

Since the following solution is based on the existing “sample_encode” project that is part of the Intel® Media SDK sample code, please first make a backup copy of the “sample_encode” and “sample_common” project folders before making the changes.

Make sure environment variables IPPROOT and IPPSAMPLESROOT are set as described in the previous chapter.

Please add the following to the Microsoft Visual Studio* project target configuration (or configurations, if you plan to compile for several targets) that is part of the “sample_encode” solution (.sln file).

Add the following to your list of project (sample_common and sample_encode) include directories:
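The exact list of include directories depends on your Intel® IPP samples layout. Assuming the IPPROOT and IPPSAMPLESROOT environment variables set up earlier, a likely starting point is along these lines (the subfolder names are assumptions based on a typical audio-video-codecs samples layout; verify them against your installation and against the headers listed later in this article):

	$(IPPROOT)\include
	$(IPPSAMPLESROOT)\audio-video-codecs\core\umc\include
	$(IPPSAMPLESROOT)\audio-video-codecs\core\vm\include
	$(IPPSAMPLESROOT)\audio-video-codecs\core\vm_plus\include

plus the include folders of the individual codec components used by the muxer (for example the file writer, MP4 muxer, and MPEG-2 muxer components under the codec subfolder).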


Add the following to your list of project (sample_encode) library directories:


(Replace ia32_cl9 with appropriate folder name reflecting your architecture and tool version, as described in the previous chapter)


(ia32 or intel64 depending on selected architecture)
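Based on the build output location from the previous chapter and the IPPROOT location, the two library directories are likely the following (these paths are inferred from the folders described earlier; adjust the architecture and tool-version parts as noted above):

	$(IPPSAMPLESROOT)\audio-video-codecs\_bin\ia32_cl9\lib
	$(IPPROOT)\lib\ia32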

Add the following to your list of project (sample_encode) library dependencies:
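The dependencies are the static libraries produced by the Intel® IPP samples build (found in the _bin\<arch>_cl9\lib folder created in the previous chapter) together with the core Intel® IPP libraries. The exact file names vary between samples versions, so take them from that folder; as an assumed illustration, they typically cover the UMC core, vm/vm_plus, media buffer, file writer, and MP4/MPEG-2 muxer components.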


Add the following to your list of project (sample_encode) ignored libraries:
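Which libraries to ignore depends on the run-time library mismatch the linker reports when combining the /MTd-built sample code with the prebuilt libraries; a common example is ignoring the release C run-time library (LIBCMT). Treat any specific name here as an assumption and base the final list on the actual linker warnings.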


For this specific solution we are using statically linked Intel® IPP libraries, which makes the executable a bit larger but also more portable compared to a solution using the Intel® IPP DLLs.


Usage scenario / command-line input

To expose the new muxing feature we add a new “-mux” command line option.

Using this new option we encode raw YUV video into an H.264 video stream muxed into an mp4 container with the following example command:


sample_encode.exe h264 -i raw.yuv -o out.mp4 -w 640 -h 480 -mux


*Note: The same options available as part of the original encode sample are also available in the new muxer solution.

If the encode target is set to “mpeg2” instead of “h264”, the output container will be of type mpeg.
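For example, the following command (assuming the same raw input file) produces an MPEG-2 stream muxed into an mpg container:

	sample_encode.exe mpeg2 -i raw.yuv -o out.mpg -w 640 -h 480 -mux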

The following code in “sample_encode.cpp” implements support for the new option:

	else if (0 == _tcscmp(strInput[i], _T("-mux")))
	       pParams->bMux = true;
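The bMux flag itself is a new member added to the encoder input parameter structure. A minimal sketch, assuming the sInputParams structure used by sample_encode (the surrounding members are omitted here):

	struct sInputParams
	{
		// ... existing sample_encode parameters ...
		bool bMux;	// new: mux the encoded stream into an mp4/mpg container
	};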

Integration of new Muxer class

The muxing capability is integrated into the existing sample_encode project by subclassing the CSmplBitstreamWriter class. The new subclass is named MuxWriter and implements the complete mp4 and mpeg muxing functionality.

To integrate with the new muxer class the following changes are made to “pipeline_encode.cpp” and “pipeline_user.cpp”. The code checks if the muxing option is enabled and instantiates the appropriate object type:

	if (!pParams->bMux)
	{
		m_pFileWriter = new CSmplBitstreamWriter();
		sts = m_pFileWriter->Init(pParams->strDstFile);
	}
	else
	{
		m_pFileWriter = new MuxWriter();
		sts = static_cast<MuxWriter*>(m_pFileWriter)->Init(
			pParams->strDstFile,
			/* ... width, height, frame rate, bit rate and codec ID ... */);
	}

As can be seen in the code above, the file writer object in “pipeline_encode.h” has been changed from a directly accessed object to a pointer.

	CSmplBitstreamWriter *m_pFileWriter;

Also note the new muxing-related parameters required by the MuxWriter class.


Muxer class

The majority of the code changes are in the “sample_common” project. The new muxer class MuxWriter is a subclass of CSmplBitstreamWriter and is implemented in the existing “sample_utils.cpp” file.

In the sample_utils.h include file we must first include all the required header files from the Intel® IPP samples audio/video codec components.


	#include "umc_structures.h"
	#include "umc_muxer.h"
	#include "umc_mp4_mux.h"
	#include "umc_file_writer.h"
	#include "umc_mpeg2_muxer.h"
	#include "umc_mpeg2_muxer_chunk.h"


Then we need to define the new MuxWriter class as follows:

	class MuxWriter : public CSmplBitstreamWriter
	{
	public:
		virtual ~MuxWriter();

		virtual mfxStatus Init(const TCHAR *strFileName,
					const mfxU16 nWidth,
					const mfxU16 nHeight,
					const mfxF64 dFrameRate,
					const mfxU16 nBitRate,
					const mfxU32 nCodecId);
		virtual mfxStatus WriteNextFrame(
					mfxBitstream *pMfxBitstream,
					bool isPrint = true);
		virtual void Close();

		UMC::Muxer		*m_pMuxer;
		UMC::MuxerParams	*m_pMuxerParams;
		UMC::FileWriter		m_writer;
		UMC::FileWriterParams	m_fwp;
		UMC::VideoStreamInfo	m_videoInfo;
		UMC::MediaData		m_videoData;
		Ipp32s			m_videoTrackID;
		Ipp64f			m_frameDuration;
	};

From the code above you can see that we utilize the base, mp4, and mpeg2 muxer capabilities from the Intel® IPP sample code. Also note that the MuxWriter initialization parameters, such as bit rate, dimensions, and frame rate, are all required as metadata by the muxer.

We implement the full muxer behavior in the MuxWriter class in “sample_utils.cpp”. There are three key operations involved in using the Intel® IPP samples muxer components:

    1. Init: Initialize the muxer with stream information such as type, bit rate, dimensions and buffer size
    2. PutVideoData: This operation provides the muxer with data for one frame (alternatively LockBuffer and UnlockBuffer could be used to achieve the same result)
    3. PutEndOfStream: This operation effectively completes the muxing into the media container

The code below shows how these operations are used in the context of the MuxWriter class.

	void MuxWriter::Close()
	{
		m_pMuxer->PutEndOfStream(m_videoTrackID); // Mark end of muxing. Close file

		m_bInited = false;
		m_nProcessedFramesNum = 0;
	}

	mfxStatus MuxWriter::Init(	const TCHAR *strFileName,
					const mfxU16 nWidth,
					const mfxU16 nHeight,
					const mfxF64 dFrameRate,
					const mfxU16 nBitRate,
					const mfxU32 nCodecId)
	{
		UMC::Status umcRes = UMC::UMC_OK;

		// Select muxer type
		if (MFX_CODEC_AVC == nCodecId)
		{
			// Initialize appropriate muxer object and set overall video and stream type of the container
			m_pMuxer	= new UMC::MP4Muxer();
			m_pMuxerParams	= new UMC::MuxerParams();
			m_videoInfo.stream_type		= UMC::H264_VIDEO;
			m_pMuxerParams->m_SystemType	= UMC::MPEG4_SYSTEM_STREAM;
		}
		else if (MFX_CODEC_MPEG2 == nCodecId)
		{
			m_pMuxer	= new UMC::MPEG2Muxer();
			m_pMuxerParams	= new UMC::MPEG2MuxerParams();
			m_videoInfo.stream_type		= UMC::MPEG2_VIDEO;
			m_pMuxerParams->m_SystemType	= UMC::MPEG2_PURE_VIDEO_STREAM;
		}
		else
			return MFX_ERR_UNSUPPORTED;

		// Initialize file writer
		char tempStr[UMC::MAXIMUM_PATH];
		wcstombs(tempStr, strFileName, UMC::MAXIMUM_PATH);
		strcpy((char *)m_fwp.m_file_name, tempStr);
		m_fwp.m_portion_size = 0;
		umcRes = m_writer.Init(&m_fwp);

		// Video info - Configure video stream information such as bit rate and resolution
		m_videoInfo.clip_info.height	= nHeight;
		m_videoInfo.clip_info.width	= nWidth;
		m_videoInfo.bitrate		= nBitRate;
		m_videoInfo.framerate		= dFrameRate;
		m_videoInfo.streamPID		= UMC::IdBank::NO_ID;

		// Muxer config
		m_pMuxerParams->m_lpDataWriter		= &m_writer;
		// Since we are only muxing video into the container there is only one track
		m_pMuxerParams->m_nNumberOfTracks	= 1;
		m_pMuxerParams->pTrackParams		=
			new UMC::TrackParams[m_pMuxerParams->m_nNumberOfTracks];
		m_pMuxerParams->pTrackParams[0].type		= UMC::VIDEO_TRACK;
		m_pMuxerParams->pTrackParams[0].info.video	= &m_videoInfo;
		// Configure muxer internal buffer parameters
		m_pMuxerParams->pTrackParams[0].bufferParams.m_prefOutputBufferSize	= 1000000;
		m_pMuxerParams->pTrackParams[0].bufferParams.m_prefInputBufferSize	= 1000000;
		m_pMuxerParams->pTrackParams[0].bufferParams.m_numberOfFrames		= 30;

		m_frameDuration = (Ipp64f)1 / dFrameRate; // Frame duration; required for time stamps

		// Initialize muxer
		umcRes = m_pMuxer->Init(m_pMuxerParams);

		// Initial alloc of memory for muxer data buffer. Extended at runtime if needed
		m_videoData.Alloc(400000);	// Buffer to hold encoded video frame

		m_bInited = true;
		return MFX_ERR_NONE;
	}

	mfxStatus MuxWriter::WriteNextFrame(mfxBitstream *pMfxBitstream, bool isPrint)
	{
		UMC::Status umcRes = UMC::UMC_OK;

		umcRes = m_videoData.SetDataSize(pMfxBitstream->DataLength);
		if (UMC::UMC_OK != umcRes)
		{
			// If larger buffer is needed
			umcRes = m_videoData.Alloc(pMfxBitstream->DataLength);
			if (UMC::UMC_OK == umcRes)
				umcRes = m_videoData.SetDataSize(pMfxBitstream->DataLength);
		}

		// Copy Intel® Media SDK encoded bitstream (frame) into muxer frame buffer
		memcpy(m_videoData.GetDataPointer(),
			pMfxBitstream->Data + pMfxBitstream->DataOffset,
			pMfxBitstream->DataLength);

		// Muxer requires time stamps
		// (Adding timestamp offset to oblige Windows Media Player)
		Ipp64f tss = m_nProcessedFramesNum * m_frameDuration + m_frameDuration; // Time stamps required by container
		umcRes = m_videoData.SetTime(tss, tss + m_frameDuration);

		umcRes = m_pMuxer->PutVideoData(&m_videoData, 0); // Mux frame buffer into container

		// Mark that we don't need bit stream data any more
		pMfxBitstream->DataLength = 0;

		m_nProcessedFramesNum++;
		if (isPrint && 0 == (m_nProcessedFramesNum - 1) % 100)
			_tcprintf(_T("Frame number: %hd\r"), m_nProcessedFramesNum);

		return MFX_ERR_NONE;
	}

Note that the above code provides an example of how to use the Intel® IPP sample muxer components together with Intel® Media SDK. There are several additional muxer options a developer can explore. For further details refer to the Intel® IPP samples source code or the document located in the “doc” folder under “audio-video-codecs”.

The “audio-video-codecs” samples (sometimes also referred to as Unified Media Classes, or UMC) also provide a wide range of other codec components, such as demuxers (splitters) and audio codecs, that could be used in connection with Intel® Media SDK.


Additional changes

To ensure smooth playback (and quick repositioning) on most common media players, we also recommend changing the default GOP encoder settings in InitMfxEncParams (in “pipeline_encode.cpp”).

As an example we choose a GOP size of 30 frames (one I frame every 30 frames) for H.264, and 16 frames for MPEG-2.


	if(pInParams->CodecId == MFX_CODEC_MPEG2)  // MPEG2
	{
		// Required for MPEG to insert sequence header before every I frame
		m_mfxEncParams.mfx.IdrInterval	= 1;
		m_mfxEncParams.mfx.GopPicSize	= 16;
		m_mfxEncParams.mfx.GopRefDist	= 3;
	}
	else   // H.264
	{
		m_mfxEncParams.mfx.GopPicSize	= 30;
		m_mfxEncParams.mfx.GopRefDist	= 1;
	}



In this article we presented a method to integrate Intel® Media SDK with the Intel® IPP samples, enabling developers to enhance their Intel® Media SDK solutions with muxing capabilities. A developer can use this method to quickly implement muxing support in an Intel® Media SDK solution.

This is not the only way of integrating muxing capabilities with Intel® Media SDK; there are many other third-party tools or methods that can be used to perform muxing. However, the presented method provides good flexibility and tight integration.

For developer questions on how to use Intel® Media SDK in general, please refer to the Intel® Media SDK forum on the Intel® Developer Zone site.


