> This gross oversight in the overengineered (especially for its time) MPEG-PS and MPEG-TS container formats just leaves me dumbfounded. If anybody knows why the MPEG standard doesn't just provide a byte size in the header of each frame or even just a FRAME_END code, or if you have a solution for this problem, let me know!

Because the video encoding was created in 1988 and the mux format in 1995, when large amounts of fast RAM were incredibly expensive and recording/transcoding/processing devices didn't always even have a framebuffer to store a full frame. Many MPEG-1, MPEG-2, and even MPEG-4 AVC Baseline limitations become obvious when you consider that streams might be encoded on CPUs slower than 150 MHz and decoded on devices with only a few macroblocks' worth of storage for the decoded frame.

> Interestingly, if I interpret the source correctly, ffmpeg chose the second option (waiting for the next PICTURE_START_CODE) even for the MPEG-TS container format, which is meant for streaming. So demuxing MPEG-TS with ffmpeg always introduces a frame of latency.

I think the confusion here is that MPEG-TS was created for broadcast TV streaming, not realtime streaming. Broadcast TV can easily be seconds behind the source these days and has probably travelled at least once from geostationary orbit, so one frame really isn't something anyone cares about. The more modern HLS/DASH formats tend to be even worse at this, with many sources waiting for a full several-second chunk to be complete before transmitting it to the viewer's device.
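For anyone unfamiliar with the framing problem being discussed: an MPEG-1/2 video elementary stream carries no per-picture length field, so a demuxer only learns that a picture is complete when it sees the start code of whatever comes next. A minimal sketch of that scan is below (start-code values are from the MPEG video spec; the buffer handling is simplified and is not how ffmpeg or the article's decoder is actually structured):

```c
#include <stddef.h>
#include <stdint.h>

/* MPEG-1/2 video start codes (the 0x000001 prefix plus one code byte). */
#define PICTURE_START_CODE  0x00000100u
#define SEQ_HEADER_CODE     0x000001B3u
#define GROUP_START_CODE    0x000001B8u

/* Scan buf[pos..len) for the next picture/sequence/GOP start code and
   return its offset, or -1 if none is found yet. Because there is no
   per-picture byte size or FRAME_END marker, the only way to know that
   picture N has ended is to find the start code that begins picture N+1
   (or a sequence/GOP header) - hence one picture of inherent latency. */
static ptrdiff_t find_next_start_code(const uint8_t *buf, size_t pos, size_t len) {
    for (size_t i = pos; i + 3 < len; i++) {
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 && buf[i + 2] == 0x01) {
            uint32_t code = 0x00000100u | buf[i + 3];
            if (code == PICTURE_START_CODE ||
                code == SEQ_HEADER_CODE ||
                code == GROUP_START_CODE) {
                return (ptrdiff_t)i;
            }
        }
    }
    return -1; /* current picture may still be incomplete; need more data */
}
```

A demuxer that wants to hand out whole pictures therefore has to buffer the current picture until this scan succeeds on later data, which is exactly the frame of latency the article describes.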
I have been wondering if we could build something on top of MPEG-2, AC-3, and MP3, codecs whose patents have expired, to get something that is truly patent-free. Which reminds me of Musepack, based on MPEG-1 Layer 2 [1]. Truly amazing quality at the time, even compared to high-bitrate AAC.

[1] https://www.musepack.net
Love to see stuff like this. I wonder why he put all of the code in a header file, though... I've never seen that done before; it seems like it would make it impossible to invoke this from two separate source files?
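In case it helps: single-header C libraries usually avoid the multiple-definition problem with the stb-style trick, where the header is declarations-only by default and exactly one translation unit opts into the implementation via a macro. Assuming this library follows that convention (the macro name below is the usual guess; check the actual header for the exact spelling), usage looks roughly like:

```c
/* decoder.c -- the ONE translation unit that compiles the implementation.
   Macro name assumed from the common stb-style convention. */
#define PL_MPEG_IMPLEMENTATION
#include "pl_mpeg.h"

/* main.c (and any other source file) -- include the header normally;
   without the macro it only provides declarations, so there are no
   duplicate symbols at link time. */
#include "pl_mpeg.h"
```

With that guard, the function bodies are compiled exactly once, so calling it from two separate source files links fine, and the project stays a single drop-in file, which is the usual reason authors package it this way.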
How long until someone runs this through Emscripten, so it runs directly in JS in the browser?

And how would that compare to others?

https://jsmpeg.com/