Can I place Mp4 metainfo at the beginning of the file?

As I understand it, ffmpeg by default places all the metadata (the moov atom) at the end of the file when generating MP4s. Source: http://www.stoimen.com/blog/2010/11/12/how-to-make-mp4-progressive-with-qt-faststart/

What I need to do is create the video one frame at a time and then stream it. That is impossible if the header info is at the end of the file, because a streamed video never reaches the end. So, is it possible to move it to the beginning, or otherwise work around this? I would definitely prefer not to switch to another library; we're already over budget on this project.

This seems to indicate that it is possible to stream ffmpeg mp4 files?
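For reference, FFmpeg itself can be told where to put the moov atom through the mov muxer's "movflags" option. Here is a minimal sketch; the option names are standard libavformat, but whether the Media SDK wrapper gives you a place to pass them through to its AVFormatContext is an assumption:

```c
#include <libavformat/avformat.h>

/* Sketch: two ways to make the MP4 header usable for streaming, assuming
 * direct access to the fully set-up AVFormatContext (oc). */
static int write_header_for_streaming(AVFormatContext *oc)
{
    AVDictionary *opts = NULL;

    /* Option A: "faststart" moves the moov atom to the front of the file
     * when the muxer is closed -- but it needs a seekable output, so it
     * only helps progressive download, not true live streaming: */
    /* av_dict_set(&opts, "movflags", "faststart", 0); */

    /* Option B: fragmented MP4. "empty_moov" writes a sample-less moov
     * atom up front, and "frag_keyframe" starts a new fragment at each
     * keyframe, so the output is playable as it is produced. */
    av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov", 0);

    int ret = avformat_write_header(oc, &opts);
    av_dict_free(&opts);
    return ret;
}
```

With option B the player never needs to seek to the end of the file, which is what an HTML5 video tag fed by a live stream requires.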

Related to this question, does the Media SDK even generate this moov atom? In my code I call MFXMuxer_PutBitstream a number of times, during which I doubt the moov atom is being generated, and the file is written to after every call. Then I call MFXMuxer_Close, which is where I assume it would be generated, except that I never write to the file afterward. Is there a way to write it to the file using the Media SDK?

Looking into this further, it appears that the Media SDK library does not support this. I'm hardcoding it for now, but if you wanted to support streaming MP4s it would be a matter of changing this line in MFXMuxer_PutBitstream, probably as an option of your SDK, because I believe it increases the size of the generated video stream (essentially you are writing a header for every frame):

av_interleaved_write_frame(in_mux->format_context, &packet)

to:

av_interleaved_write_frame(in_mux->format_context, NULL)

If the packet is NULL, it seems the muxer will automatically write the information needed for fragmented MP4s: http://www.ffmpeg.org/doxygen/trunk/group__lavf__encoding.html#ga37352ed...

As indicated here: http://libav.org/avconv.html#MOV_002fMP4_002fISMV
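A note on the NULL-packet behaviour, going by the libavformat documentation: av_interleaved_write_frame(ctx, NULL) flushes the interleaving buffers, while fragment boundaries chosen by the caller are the job of the "frag_custom" movflag together with av_write_frame(ctx, NULL). A sketch of that pattern, assuming the muxer was opened with -movflags frag_custom:

```c
#include <libavformat/avformat.h>

/* Cut a fragment at a point of the caller's choosing. Assumes the
 * AVFormatContext (oc) was opened with the "frag_custom" movflag. */
static int flush_fragment(AVFormatContext *oc)
{
    /* Drain any packets still buffered for interleaving first. */
    int ret = av_interleaved_write_frame(oc, NULL);
    if (ret < 0)
        return ret;

    /* With frag_custom, a NULL packet here writes a moof/mdat pair
     * containing everything muxed since the previous fragment. */
    return av_write_frame(oc, NULL);
}
```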

I'm not entirely sure this actually works, however, because the issue I'm now having is that it turns out I haven't actually been saving the frames generated by MFXMuxer_PutBitstream to a file. I had some misdirected pointers, and I thought it was working when it was actually just saving the original H.264 frames. Could you tell me how I can access the bitstream written by MFXMuxer_PutBitstream after I've called it? I don't see anything in the documentation.

I assume it has something to do with mux->data_io->Read?

I'm still checking into the location of the moov atom, but the ffmpeg integration code in the Media SDK tutorial package has been used in live-streaming demos. If you output to UDP, you can display the live stream with ffplay using what is included in the tutorial.

I've heard that it is sometimes possible to stream even without the moov atom, but often the video player will see that it's missing and throw an error instead. It doesn't always work, and I hear it doesn't work in my case, where I want to stream to an HTML5 video tag.

Regarding the other issue I mentioned, which is currently more important: I am not able to read the generated MP4 frames after calling MFXMuxer_PutBitstream. I feel it should call data_io->Write, but it does not, correct? It should write to the mfxMuxer's data_io object, which was set in the Init method to the data_io that was passed into MFXMuxer_Init as mfxDataIO *data_io, right?
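On the FFmpeg side, the usual way to see every byte the muxer emits is to give the AVFormatContext a custom AVIOContext with a write callback, instead of letting avio_open write to a file. A sketch below; the callback and function names are mine, and how the Media SDK's mfxDataIO hooks map onto this is an assumption:

```c
#include <libavformat/avformat.h>

/* Called by libavformat each time the muxer emits bytes; forward them to
 * whatever sink you like (a socket, your mfxDataIO Write hook, etc.). */
static int on_muxed_bytes(void *opaque, uint8_t *buf, int buf_size)
{
    /* ... hand buf/buf_size to your streaming sink here ... */
    return buf_size;  /* report that all bytes were consumed */
}

/* Replace the output I/O of a fully set-up AVFormatContext (oc) with the
 * capturing callback above. */
static int attach_capture_io(AVFormatContext *oc, void *opaque)
{
    const int buf_size = 4096;
    uint8_t *buffer = av_malloc(buf_size);
    if (!buffer)
        return AVERROR(ENOMEM);

    AVIOContext *io = avio_alloc_context(
        buffer, buf_size,
        1,               /* write_flag: context is opened for writing */
        opaque,
        NULL,            /* no read callback */
        on_muxed_bytes,  /* write callback */
        NULL);           /* no seek callback: output is non-seekable */
    if (!io) {
        av_free(buffer);
        return AVERROR(ENOMEM);
    }
    oc->pb = io;
    return 0;
}
```

Note that with no seek callback the output is non-seekable, which is another reason the fragmented-MP4 movflags discussed above are needed: the muxer cannot go back to rewrite the moov atom at close time.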
