Intended to replace libff as the media playback library. Intended to
use fewer threads and be more extensible; it was nearly impossible to
modify libff without bursting a vein.
In some cases the result of the compatibility check is wrong.
For example, the format "mpegts" only shows "mpeg2video" as an
encoder even though other codecs such as H.264 are supported by
FFmpeg's muxer for that container and are used within that container
in some applications.
Closes jp9000/obs-studio#804
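For reference, FFmpeg itself can be asked whether a muxer accepts a given
codec. A minimal sketch (not the libff code, assuming the usual libavformat
and libavcodec headers) that queries the mpegts muxer about H.264:

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    int main(void)
    {
        /* Ask the mpegts muxer whether it can store H.264, even though
         * mpeg2video is its default video codec. */
        const AVOutputFormat *fmt = av_guess_format("mpegts", NULL, NULL);
        if (!fmt)
            return 1;

        int ret = avformat_query_codec(fmt, AV_CODEC_ID_H264,
                                       FF_COMPLIANCE_NORMAL);
        printf("h264 in mpegts: %s\n",
               ret == 1 ? "supported" :
               ret == 0 ? "not supported" : "unknown");
        return 0;
    }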
FFmpeg by default decodes VP8/VP9 via its internal decoders; however,
those internal decoders do not support alpha. Encoded alpha is stored
via meta/side data in the container, so the only way to decode it
properly is by forcing FFmpeg to use libvpx for decoding.
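A minimal sketch of that idea for VP9 ("libvpx" is the corresponding VP8
decoder name), falling back to the built-in decoder if libvpx is not
compiled in:

    #include <libavcodec/avcodec.h>

    /* Prefer libvpx so alpha can be decoded; fall back to the built-in
     * decoder (which ignores alpha) if libvpx is unavailable. */
    static const AVCodec *find_vp9_decoder(void)
    {
        const AVCodec *codec = avcodec_find_decoder_by_name("libvpx-vp9");
        if (!codec)
            codec = avcodec_find_decoder(AV_CODEC_ID_VP9);
        return codec;
    }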
(Note: This commit also modifies the ipc-util/seg-service modules)
When compiling the final project, always compile
ipc-util/get-graphics-offsets/graphics-hook/inject-helper/seg-service
with static MSVC runtimes to avoid requiring the MSVC runtimes for both
architectures.
(Also modifies obs-ffmpeg to handle empty frames on EOF)
Previously the demuxer could hit EOF before the decoder threads were
finished, resulting in truncated output. In the worst-case scenario the
demuxer could read small files before ff_decoder_refresh even had a chance
to start the clocks, resulting in no output at all.
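The usual FFmpeg technique for avoiding truncated output is to drain the
decoder once the demuxer reaches EOF. A sketch using the current
send/receive API (the libff code of the time used the older decode calls):

    #include <libavcodec/avcodec.h>

    /* Flush the decoder at EOF: a NULL packet tells it no more input is
     * coming, then the remaining buffered frames are read out. */
    static void drain_decoder(AVCodecContext *dec, AVFrame *frame)
    {
        avcodec_send_packet(dec, NULL);

        while (avcodec_receive_frame(dec, frame) == 0) {
            /* deliver the final frames as usual */
            av_frame_unref(frame);
        }
    }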
How to crash:
1. Use recent FFmpeg shared libraries.
2. Add an ffmpeg_source pointing at a small static picture (e.g. a JPEG)
   with looping enabled.
3. After a while of high CPU usage, it crashes. This seems to reproduce
   more easily on faster computers.
Closes #533
There's no need to duplicate the packet as the reference count will be 1
after the av_read_frame call. Duplicating causes heap corruption when a
synthetic clock packet is duplicated and assigned the buffer from the
stack-based temporary packet which is then double-freed by the decoder
thread.
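A sketch of the intended pattern (the helper name is hypothetical): the
packet handed back by av_read_frame already owns its reference, so it can
be queued directly:

    #include <libavformat/avformat.h>

    /* Read one packet; the caller takes ownership. No av_dup_packet()/
     * av_packet_ref() is needed before queueing it. */
    static AVPacket *read_one_packet(AVFormatContext *fmt)
    {
        AVPacket *pkt = av_packet_alloc();
        if (!pkt)
            return NULL;

        if (av_read_frame(fmt, pkt) < 0) {
            av_packet_free(&pkt);
            return NULL;
        }

        return pkt; /* reference count is already 1 */
    }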
avformat_free_context() only frees the memory used by an AVFormatContext
but it does not close the opened media file. This causes a leaked file
descriptor every time a media source frees a demuxer. Using
avformat_close_input() instead frees the context and closes the media
file.
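A sketch of the difference when tearing down a demuxer:

    #include <libavformat/avformat.h>

    static void destroy_demuxer(AVFormatContext **fmt)
    {
        /* avformat_free_context(*fmt) would free the context but leave
         * the media file open, leaking a file descriptor.
         * avformat_close_input() closes the file, frees the context, and
         * NULLs the pointer. */
        avformat_close_input(fmt);
    }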
Fixes warnings with deprecated packet functions (av_free_packet and
av_dup_packet, which were replaced by av_packet_unref and av_packet_ref
respectively).
Just in case glXSwapIntervalEXT and glXSwapIntervalSGI aren't available
for whatever reason. This entire patch is most likely completely
redundant on modern Mesa drivers.
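A hypothetical sketch of the fallback, resolving the GLX_MESA_swap_control
entry point at runtime (the lookup and cast are illustrative, not the exact
patch):

    #include <GL/glx.h>

    static void set_swap_interval_mesa(int interval)
    {
        typedef int (*swap_interval_mesa_t)(unsigned int);

        /* Resolve glXSwapIntervalMESA at runtime; it may be absent on
         * non-Mesa drivers, in which case nothing happens. */
        swap_interval_mesa_t swap = (swap_interval_mesa_t)
            glXGetProcAddress((const GLubyte *)"glXSwapIntervalMESA");

        if (swap)
            swap((unsigned int)interval);
    }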
This allows plugins to update and cache data files from a remote source.
Here are the steps that occur when the API initiates an update check:
1.) It checks to see if the local file versions are newer than the cached
files. If a local version is newer (for whatever reason), it replaces
the cached version with the local version.
2.) A packages.json file is downloaded from the specified URL. That
packages.json file contains a version number and a list of files to
be updated.
3.) If the downloaded package version is greater than the cached
version, steps 4-5 are executed for each file.
4.) It checks the version of the file to update in packages.json; if
that version is greater than the cached version, it proceeds to step 5,
otherwise it repeats steps 4-5 for the remaining files.
5.) It calls the callback given to the update function (if any) with the
file information (file name, buffer, etc.), and if the callback
returns true, it allows the cached file to be updated and replaced;
otherwise it goes back to steps 4-5 for the rest of the files. (A usage
sketch of this callback follows the note below.)
NOTE: Files are never modified directly. All file saving/modification
is performed in a temporary directory, and then files are moved to their
destination. This should eliminate any possibility of file corruption
(or at least dramatically reduce the possibility).
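A hypothetical usage sketch of the confirmation callback described in
step 5; the struct, field, and function names here are assumptions for
illustration, not the exact API:

    #include <stdbool.h>
    #include <stddef.h>

    /* Hypothetical shape of the per-file data handed to the callback. */
    struct downloaded_file {
        const char *name;    /* file name listed in packages.json */
        const void *buffer;  /* downloaded contents                */
        size_t     size;
        int        version;  /* version listed in packages.json    */
    };

    /* Returning true lets the updater replace the cached copy;
     * returning false keeps the cached file and moves on. */
    static bool confirm_update(void *param, struct downloaded_file *file)
    {
        (void)param;
        return file->name != NULL && file->size > 0;
    }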
If the first guessed pts is less than the start_pts, it could
lead to a negative PTS being returned.
Change the behavior so that the first frame's pts, if zero, is
set to the start_pts. If a later frame's pts is also less than the
start_pts, the start_pts is determined invalid and set to 0.
Valid start_pts example:

    start_pts = 500

    first frame (pts = 0)
      pts = 500 (< start_pts)
      pts -= 500 (offset by start_pts)
      ret 0

    second frame (pts = 700)
      pts = 700 (no change, > start_pts)
      pts -= 500 (offset by start_pts)
      ret 200

Invalid start_pts example:

    start_pts = 500

    first frame (pts = 0)
      pts = 500 (< start_pts)
      pts -= 500 (offset by start_pts)
      ret 0

    second frame (pts = 300)
      pts = 300 (< start_pts, start_pts set to 0)
      pts -= 0 (start_pts is now 0)
      ret 300
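A sketch of that rule (the function name is illustrative, not the exact
libff code):

    #include <stdbool.h>
    #include <stdint.h>

    /* The first frame with pts 0 is pinned to start_pts; if a later frame
     * is still below start_pts, start_pts is invalidated (set to 0), so
     * the returned pts can never go negative. */
    static int64_t offset_pts(int64_t pts, int64_t *start_pts,
                              bool *first_frame)
    {
        if (*first_frame) {
            *first_frame = false;
            if (pts == 0)
                pts = *start_pts;
        } else if (pts < *start_pts) {
            *start_pts = 0;
        }

        return pts - *start_pts;
    }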
ff_clock_init expects an out parameter (a pointer in which it stores the
address of the newly allocated ff_clock), but ff_demuxer_reset does not
provide this parameter. That somehow writes the ff_clock pointer into the
packet->base->buf field on the stack of the ff_demuxer_reset function,
which later causes a segmentation fault when the packet is freed.
Closes jp9000/obs-studio#448
This was the reason why game capture could not hook when the hook was
run at administrator level and the game/target was below administrator
level: the plugin created a pipe, and the hook tried to connect to that
pipe, but because the pipe was created as administrator with default
access rights, the pipe did not allow write access for anything below
administrator level. Therefore the hook could not connect to the plugin,
and the hook would always fail as a result.
This fixes the issue by creating the pipe with full access rights for
everyone instead of default access rights.
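One common way to grant that access is a security descriptor with a
present-but-NULL DACL, which allows everyone full access; a hypothetical
sketch, not the exact ipc-util code (which may build an explicit ACL
instead):

    #include <windows.h>

    static HANDLE create_open_pipe(const wchar_t *name)
    {
        SECURITY_DESCRIPTOR sd;
        SECURITY_ATTRIBUTES sa = {0};

        InitializeSecurityDescriptor(&sd, SECURITY_DESCRIPTOR_REVISION);
        /* A present-but-NULL DACL grants full access to everyone instead
         * of the restrictive default inherited from the admin token. */
        SetSecurityDescriptorDacl(&sd, TRUE, NULL, FALSE);

        sa.nLength = sizeof(sa);
        sa.lpSecurityDescriptor = &sd;
        sa.bInheritHandle = FALSE;

        return CreateNamedPipeW(name, PIPE_ACCESS_INBOUND,
                                PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE |
                                        PIPE_WAIT,
                                1, 4096, 4096, 0, &sa);
    }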
Certain input streams (such as remote streams that are already active)
can start up mid-stream with very high initial timestamp values.
Because of this, the libff timer would delay for that initial timestamp,
which often caused it to not render at all because it was stuck waiting.
To fix the problem, we should ignore the timestamp difference of the
first frame when it's above a certain threshold.
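A sketch of that idea, with a hypothetical threshold constant and an
illustrative function name:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical threshold, in the clock's time base units. */
    #define MAX_INITIAL_TS_DIFF 1000000

    /* If the very first frame is already far ahead of the clock, treat
     * the gap as a stream offset instead of a delay to wait out. */
    static int64_t frame_delay(int64_t frame_ts, int64_t clock_ts,
                               bool first_frame)
    {
        int64_t diff = frame_ts - clock_ts;

        if (first_frame && diff > MAX_INITIAL_TS_DIFF)
            return 0;

        return diff;
    }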
Now that we're using the timestamps from the stream for playback,
certain types of streams and certain file formats will not start from a
pts of 0. This causes the start of the playback to be delayed. This
code simply ensures that there's no delay on startup. This is basically
the same code as used in FFmpeg itself for handling this situation.
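A sketch of the idea, offsetting by the stream's reported start time
(the function name is illustrative, not the exact code):

    #include <libavformat/avformat.h>

    /* Shift timestamps so playback starts at zero even when the stream's
     * first pts is far from zero. */
    static int64_t pts_from_zero(const AVStream *stream, int64_t pts)
    {
        if (stream->start_time != AV_NOPTS_VALUE)
            pts -= stream->start_time;

        return pts;
    }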
Removed code that forced a PTS diff to the previous PTS diff when the
diff was greater than a certain threshold; that behavior breaks
variable-length-frame media such as GIF.
Always use -fPIC when not on WIN32 or APPLE, not just with GCC.
This allows building obs with clang on Linux and FreeBSD
without explicitly specifying -fPIC as a compiler flag to cmake.
This adds utility functions for determining which
codecs and formats are supported by the loaded FFmpeg
libraries. This includes validating the codecs that
a particular format supports.
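As an illustration of the kind of check this enables (not the utility API
itself), FFmpeg can report which registered codecs a given output format
accepts; av_codec_iterate is the modern iterator (the code of that era
used av_codec_next):

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Print every codec the given muxer reports as storable. */
    static void list_supported_codecs(const AVOutputFormat *fmt)
    {
        void *iter = NULL;
        const AVCodec *codec;

        while ((codec = av_codec_iterate(&iter)) != NULL) {
            if (avformat_query_codec(fmt, codec->id,
                                     FF_COMPLIANCE_NORMAL) == 1)
                printf("%s supports %s\n", fmt->name, codec->name);
        }
    }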