MFX_ERR_NOT_FOUND thrown in SyncOperation after hours of successful encoding

We are developing a C# application in Visual Studio that includes a C++/CLI native QuickSync project. The QuickSync project builds a DLL that is referenced by the main C# application. We have a fully async pipeline at the C# level that passes raw frame data to multiple instances of the QuickSync encoder. This has been working fairly well for us.

However, I am trying to track down an issue that occurs in the SyncOperation after a long period of successful streaming (sometimes 1 hour, sometimes 12). I have attached my main .cpp file, which handles the encoding process. The error occurs on line 110, here:

encodeStatus = videoSession->SyncOperation(*syncPoint, 60000);   //<----error here
if (encodeStatus != MFX_ERR_NONE)
    Console::WriteLine("Sync error!!");

if (!CheckResult((mfxStatus)encodeStatus, MFX_ERR_NONE))
{
    lastErrorString = "Failed sync operation, Error status: " + encodeStatus.ToString();
    return gcnew EncodedFrame();
}

The SyncOperation fails with an MFX_ERR_NOT_FOUND and never recovers. The pipeline eventually dies, as we then always receive empty frames from the return gcnew EncodedFrame() line.

Does anyone have any ideas on why this might be happening?

Thanks!

The "Select media" button isn't working on the forum (it says I am not authorized), so I have pasted the file contents below:

// This is the main DLL file.
#include "stdafx.h"
#include "NativeQuickSyncEncoder.h"

using namespace System;
using namespace VolarVideo::EncodingPipeline;

//the contents of these buffers can be referenced by the muxer so this ensures they stick around
//could copy into new buffers for every packet but just more overhead and cleanup
static UINT8 SPSBufferStatic[MAXSPSPPSBUFFERSIZE];
static UINT8 PPSBufferStatic[MAXSPSPPSBUFFERSIZE];

#pragma region constructors and destructors
NativeQuickSyncEncoder::NativeQuickSyncEncoder()
{
    // initialize pointers to null
    mfxVer = NULL;
    encoderParameters = NULL;
    videoSession = NULL;
    encoder = NULL;
    surfaceBuffers = NULL;
    encoderSurfaces = NULL;
    bitstream = NULL;
    syncPoint = NULL;
    PPSBuffer = NULL;
    SPSBuffer = NULL;

    InitializeMfx();
}

NativeQuickSyncEncoder::~NativeQuickSyncEncoder()
{
    Console::WriteLine("NativeQuickSync Dispose");

    if (encoder != NULL)
    {
        encoder->Close();
        delete encoder;
    }
    if (encoderSurfaces != NULL)   // guard: InitializeEncoder may never have run
    {
        for (int idx = 0; idx < encoderSurfaceCount; idx++)
        {
            delete encoderSurfaces[idx];
        }
    }
    MSDK_SAFE_DELETE_ARRAY(encoderSurfaces);
    MSDK_SAFE_DELETE_ARRAY(surfaceBuffers);
    if (bitstream != NULL)
        MSDK_SAFE_DELETE_ARRAY(bitstream->Data);

    if (mfxVer != NULL)
        delete mfxVer;
    if (videoSession != NULL)
        delete videoSession;
    if (bitstream != NULL)
        delete bitstream;
    if (syncPoint != NULL)
        delete syncPoint;
}

#pragma endregion

#pragma region public interface methods
EncodedFrame^ NativeQuickSyncEncoder::Encode(byte* Yinput, byte* UVinput)
{
    try
    {
        int surfaceIdx = GetFreeSurfaceIndex(encoderSurfaces, encoderSurfaceCount);
        //MSDK_CHECK_ERROR(MFX_ERR_NOT_FOUND, surfaceIdx, MFX_ERR_MEMORY_ALLOC);
        if (ErrorOccurred(surfaceIdx, MFX_ERR_NOT_FOUND))
        {
            status = MFX_ERR_MEMORY_ALLOC;
            lastErrorString = "Failed to get free surface, Error code: " + surfaceIdx.ToString();
            return gcnew EncodedFrame();
        }

        //we could do this but only if we can configure swscale to align its buffers correctly
        //encoderSurfaces[surfaceIdx]->Data.Y = Yinput;
        //encoderSurfaces[surfaceIdx]->Data.UV = UVinput;

        //have to copy source data into mem-aligned encoder surface buffers
        //the UV plane is half the height because each UV pair covers a 2x2 pixel area (NV12)
        memcpy(encoderSurfaces[surfaceIdx]->Data.Y, Yinput, bufferSurfaceWidth * inputHeight);
        memcpy(encoderSurfaces[surfaceIdx]->Data.UV, UVinput, bufferSurfaceWidth * inputHeight / 2);

        int encodeStatus = encoder->EncodeFrameAsync(NULL,
                                                     encoderSurfaces[surfaceIdx],
                                                     bitstream,
                                                     syncPoint);

        if (MFX_ERR_NONE < encodeStatus && !syncPoint) // warning & no output
        {
            if (MFX_WRN_DEVICE_BUSY == encodeStatus)
                Sleep(1);   // NOTE: this only sleeps once; it should retry EncodeFrameAsync
        }
        else if (MFX_ERR_NONE < encodeStatus && syncPoint)
        {
            encodeStatus = MFX_ERR_NONE; // ignore warnings if output is available
        }
        else if (MFX_ERR_NOT_ENOUGH_BUFFER == encodeStatus)
        {
            // Allocate a larger bitstream buffer. Might need to throw an error.
        }

        if (MFX_ERR_NONE == encodeStatus)
        {
            encodeStatus = videoSession->SyncOperation(*syncPoint, 60000);
            if (encodeStatus != MFX_ERR_NONE)
                Console::WriteLine("Sync error!!");

            if (!CheckResult((mfxStatus)encodeStatus, MFX_ERR_NONE))
            {
                lastErrorString = "Failed sync operation, Error status: " + encodeStatus.ToString();
                return gcnew EncodedFrame();
            }
            EncodedFrame^ frame = gcnew EncodedFrame(bitstream->Data, bitstream->DataLength, bitstream->DataOffset,
                                                     bitstream->TimeStamp, bitstream->DecodeTimeStamp,
                                                     bitstream->FrameType,
                                                     SPSBuffer, SPSSize,
                                                     PPSBuffer, PPSSize);
            bitstream->DataLength = 0;
            nFramesProcessed++;
            lastErrorString = "Success";
            return frame;
        }
        // no output this call (warning or more data needed): return an empty frame
        return gcnew EncodedFrame();
    }
    catch (Exception^ e)
    {
        lastErrorString = "Exception caught: " + e->Message + " Stack Trace: " + e->StackTrace;
        throw;
    }
}

bool NativeQuickSyncEncoder::ChangeBitrate(int targetKbps)
{
    // fetch the current encoder parameters (stack object avoids the old leak;
    // the previous memset used sizeof(&ptr), which only cleared pointer-sized bytes)
    mfxVideoParam currentParams;
    memset(&currentParams, 0, sizeof(currentParams));

    int sts = (int) encoder->GetVideoParam(&currentParams);
    if (sts != 0)
        return false;

    /*scaling bitrate according to action parameters*/
    currentParams.mfx.TargetKbps = targetKbps / 1000;
    currentParams.mfx.MaxKbps = targetKbps / 1000;
    //resetting the encoder, changing only the bitrate
    sts = (int) encoder->Reset(&currentParams);

    Console::WriteLine("Frame {0} : TargetBitrate = {1} Kbps\n", nFramesProcessed, currentParams.mfx.TargetKbps);

    return (sts == 0);
}

#pragma endregion

#pragma region private methods
void NativeQuickSyncEncoder::InitializeMfx()
{
    mfxIMPL mfxImpl = QSImplementation;
    mfxImplementation = new mfxIMPL(mfxImpl);
    mfxVersion requestedVersion = {0, 1};   // Minor = 0, Major = 1 -> API version 1.0
    mfxVer = new mfxVersion(requestedVersion);
}

int NativeQuickSyncEncoder::InitializeEncoder(int bitrate, int width, int height, int gop_size)
{
    syncPoint = new mfxSyncPoint();

    MFXVideoSession vS;
    videoSession = new MFXVideoSession(vS);

    status = videoSession->Init(*mfxImplementation, mfxVer);
    status = videoSession->QueryIMPL(mfxImplementation);
    status = videoSession->QueryVersion(mfxVer);
    Console::WriteLine("Using QuickSync implementation: {0} , Version: {1}.{2}\n",
                       *mfxImplementation == MFX_IMPL_SOFTWARE ? "SOFTWARE" : "HARDWARE", mfxVer->Major, mfxVer->Minor);

    inputWidth = width;
    inputHeight = height;

    // set up encoder parameters
    encoderParameters = new mfxVideoParam();
    memset(encoderParameters, 0, sizeof(*encoderParameters));   // sizeof(encoderParameters) would only clear pointer-sized bytes

    encoderParameters->mfx.CodecId = MFX_CODEC_AVC;
    encoderParameters->mfx.TargetUsage = MFX_TARGETUSAGE_BALANCED;
    encoderParameters->mfx.TargetKbps = bitrate / 1000;
    encoderParameters->mfx.RateControlMethod = MFX_RATECONTROL_CBR;
    encoderParameters->mfx.FrameInfo.FrameRateExtN = 30;
    encoderParameters->mfx.FrameInfo.FrameRateExtD = 1;
    encoderParameters->mfx.FrameInfo.FourCC = MFX_FOURCC_NV12;
    encoderParameters->mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    encoderParameters->mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_PROGRESSIVE;
    encoderParameters->mfx.FrameInfo.CropX = 0;
    encoderParameters->mfx.FrameInfo.CropY = 0;
    encoderParameters->mfx.FrameInfo.CropW = width;
    encoderParameters->mfx.FrameInfo.CropH = height;

    encoderParameters->mfx.CodecProfile = MFX_PROFILE_AVC_CONSTRAINED_BASELINE;
    encoderParameters->mfx.CodecLevel = MFX_LEVEL_AVC_31;
    encoderParameters->mfx.GopPicSize = gop_size;
    encoderParameters->mfx.GopRefDist = 1;
    encoderParameters->mfx.GopOptFlag = MFX_GOP_CLOSED;

    // Width must be a multiple of 16.
    // Height must be a multiple of 16 for frame pictures and a multiple of 32 for field pictures.
    encoderParameters->mfx.FrameInfo.Width = MSDK_ALIGN16(width);
    encoderParameters->mfx.FrameInfo.Height = (MFX_PICSTRUCT_PROGRESSIVE == encoderParameters->mfx.FrameInfo.PicStruct) ?
                                              MSDK_ALIGN16(height) : MSDK_ALIGN32(height);

    encoderParameters->IOPattern = MFX_IOPATTERN_IN_SYSTEM_MEMORY;

    // create encoder
    encoder = new MFXVideoENCODE(*videoSession);

    // validate encoder parameters; if incompatible parameters are detected, they will be changed
    status = encoder->Query(encoderParameters, encoderParameters);
    MSDK_IGNORE_MFX_STS(status, MFX_WRN_INCOMPATIBLE_VIDEO_PARAM);
    MSDK_IGNORE_MFX_STS(status, MFX_WRN_PARTIAL_ACCELERATION);

    if (status != MFX_ERR_NONE)
    {
        throw gcnew Exception("Error: setting up encoder parameters");
    }

    // query number of required surfaces for encoder
    mfxFrameAllocRequest encoderRequest;

    memset(&encoderRequest, 0, sizeof(encoderRequest));
    status = encoder->QueryIOSurf(encoderParameters, &encoderRequest);

    //MSDK_PRINT_RET_MSG(status);
    //MSDK_CHECK_RESULT(status, MFX_ERR_NONE, status);

    encoderSurfaceCount = encoderRequest.NumFrameSuggested;

    // allocate surfaces for encoder
    bufferSurfaceWidth = (mfxU16)MSDK_ALIGN32(encoderRequest.Info.Width);
    bufferSurfaceHeight = (mfxU16)MSDK_ALIGN32(encoderRequest.Info.Height);
    bitsPerPixel = 12;
    surfaceSize = bufferSurfaceWidth * bufferSurfaceHeight * bitsPerPixel / 8;
    surfaceBuffers = new mfxU8[surfaceSize * encoderSurfaceCount];
    encoderSurfaces = new mfxFrameSurface1*[encoderSurfaceCount];
    MSDK_CHECK_POINTER(encoderSurfaces, MFX_ERR_MEMORY_ALLOC);

    for (int idx = 0; idx < encoderSurfaceCount; idx++)
    {
        encoderSurfaces[idx] = new mfxFrameSurface1;
        memset(encoderSurfaces[idx], 0, sizeof(mfxFrameSurface1));
        memcpy(&(encoderSurfaces[idx]->Info), &(encoderParameters->mfx.FrameInfo), sizeof(mfxFrameInfo));
        encoderSurfaces[idx]->Data.Y = &surfaceBuffers[surfaceSize * idx];
        encoderSurfaces[idx]->Data.U = encoderSurfaces[idx]->Data.Y + bufferSurfaceWidth * bufferSurfaceHeight;
        encoderSurfaces[idx]->Data.V = encoderSurfaces[idx]->Data.U + 1;   // NV12: interleaved UV plane
        encoderSurfaces[idx]->Data.Pitch = bufferSurfaceWidth;
    }

    nFramesProcessed = 0;

    // initialize encoder and check for errors
    status = encoder->Init(encoderParameters);
    MSDK_IGNORE_MFX_STS(status, MFX_WRN_PARTIAL_ACCELERATION);
    MSDK_CHECK_RESULT(status, MFX_ERR_NONE, status);
// retrieve video parameters selected by the encoder
    // retrieve video parameters selected by the encoder
    mfxVideoParam params;
    memset(&params, 0, sizeof(params));

    // external parameters for each component are stored in a vector
    std::vector<mfxExtBuffer*> m_VppExtParams;
    std::vector<mfxExtBuffer*> m_EncExtParams;

    // extSPSPPS is used for ffmpeg muxing
    PPSBuffer = PPSBufferStatic;
    SPSBuffer = SPSBufferStatic;

    mfxExtCodingOptionSPSPPS m_extSPSPPS;
    memset(&m_extSPSPPS, 0, sizeof(mfxExtCodingOptionSPSPPS));
    m_extSPSPPS.Header.BufferId = MFX_EXTBUFF_CODING_OPTION_SPSPPS;
    m_extSPSPPS.Header.BufferSz = sizeof(mfxExtCodingOptionSPSPPS);
    m_extSPSPPS.PPSBuffer = PPSBuffer;
    m_extSPSPPS.SPSBuffer = SPSBuffer;
    m_extSPSPPS.PPSBufSize = MAXSPSPPSBUFFERSIZE;
    m_extSPSPPS.SPSBufSize = MAXSPSPPSBUFFERSIZE;

    m_EncExtParams.push_back((mfxExtBuffer *)&m_extSPSPPS);
    params.ExtParam = &m_EncExtParams[0];
    params.NumExtParam = (mfxU16)m_EncExtParams.size();

    status = encoder->GetVideoParam(&params);
    MSDK_CHECK_RESULT(status, MFX_ERR_NONE, status);

    // prepare bitstream buffer
    bitstream = new mfxBitstream();

    memset(bitstream, 0, sizeof(*bitstream));   // sizeof(bitstream) would only clear pointer-sized bytes
    bitstream->MaxLength = params.mfx.BufferSizeInKB * 1000;
    bitstream->Data = new mfxU8[bitstream->MaxLength];
    MSDK_CHECK_POINTER(bitstream->Data, MFX_ERR_MEMORY_ALLOC);

    //save away the updated sizes for later reference
    PPSSize = m_extSPSPPS.PPSBufSize;
    SPSSize = m_extSPSPPS.SPSBufSize;

    return 0;
}

String^ NativeQuickSyncEncoder::GetEncoderVersion()
{
    String^ versionString = String::Format("Using QuickSync implementation: {0} , Version: {1}.{2}",
                                           *mfxImplementation == MFX_IMPL_SOFTWARE ? "SOFTWARE" : "HARDWARE", mfxVer->Major, mfxVer->Minor);

    return versionString;
}

String^ NativeQuickSyncEncoder::GetLastErrorString()
{
    return lastErrorString;
}

int NativeQuickSyncEncoder::GetBitrate()
{
    mfxVideoParam currentParams;
    memset(&currentParams, 0, sizeof(currentParams));   // must zero before querying

    encoder->GetVideoParam(&currentParams);

    return currentParams.mfx.TargetKbps;
}

int NativeQuickSyncEncoder::GetFreeSurfaceIndex(mfxFrameSurface1** surfacesPool, mfxU16 poolSize)
{
    if (surfacesPool)
        for (mfxU16 idx = 0; idx < poolSize; idx++)
            if (surfacesPool[idx]->Data.Locked == 0)
                return idx;
    return MFX_ERR_NOT_FOUND;
}
#pragma endregion


Thanks!  I'll follow along in that thread.

Hi Zach,

Are you using HW acceleration? If you are, the issue is likely not related to the issue referenced above.

If you are using the HW accelerated path, please share details about your system configuration such as processor, graphics driver, etc. You can use the mediasdk_sys_analyzer tool to fetch this info.

Is MFX_ERR_NOT_FOUND really returned from SyncOperation()?   And not from GetFreeSurfaceIndex(), which would make more sense?

Regards,
Petter 

I am using MFX_IMPL_SOFTWARE so no HW acceleration.  According to my log files, GetFreeSurfaceIndex() fails with MFX_ERR_UNKNOWN and then the SyncOperation fails with MFX_ERR_NOT_FOUND.

I will double check my logic to ensure errors are being logged correctly.

Hi Zach,

thanks for confirming that you are using SW path.

If you look at the GetFreeSurfaceIndex() function, you will see that it cannot return MFX_ERR_UNKNOWN (at least not in the default sample implementation or the code snippet you provided), so I suspect an issue with your logging. Perhaps what is actually occurring is an MFX_ERR_UNKNOWN return code from SyncOperation(), in which case it is possibly related to the issue raised by dj_alek?

Regards,
Petter 

Yes, you are correct. Sorry for reporting it incorrectly. The SyncOperation fails with MFX_ERR_UNKNOWN (-1) first, and then GetFreeSurfaceIndex fails with MFX_ERR_NOT_FOUND (-9).
