Unbuffered mode can cause slight frame jitter during playback of media
sources, so instead of using unbuffered mode with media sources, just
leave buffering on. There may be a frame or so of latency, but it
shouldn't be noticeable to most users.
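A minimal sketch of the change, assuming the source previously enabled
unbuffered mode through the libobs call for it (the s->source field is
hypothetical private data):

    /* Before (hypothetical): push frames with buffering disabled */
    obs_source_set_async_unbuffered(s->source, true);

    /* After: leave buffering on (the libobs default), accepting roughly
     * a frame of latency in exchange for smooth playback */
    obs_source_set_async_unbuffered(s->source, false);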
If a user has a tremendous number of media files, hardware decoding can
cause instability. Instead, make hardware decoding something the user
has to explicitly enable.
Although hardware decoding was technically enabled by default even
before we fixed it, the fix effectively changed the default for users,
because hardware decoding simply wasn't available before version 24.
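A minimal sketch of making the setting opt-in, assuming a hypothetical
"hw_decode" setting name, using the standard libobs defaults/properties
calls:

    static void mps_defaults(obs_data_t *settings)
    {
        /* hardware decoding now defaults to off */
        obs_data_set_default_bool(settings, "hw_decode", false);
    }

    static obs_properties_t *mps_properties(void *data)
    {
        obs_properties_t *props = obs_properties_create();
        UNUSED_PARAMETER(data);

        /* the user must explicitly enable hardware decoding */
        obs_properties_add_bool(props, "hw_decode",
                                obs_module_text("HardwareDecode"));
        return props;
    }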
Code submissions have continually suffered from formatting
inconsistencies that have to be addressed during review. Using
clang-format makes code formatting consistent and allows it to be
automated, so maintainers can focus on the code itself rather than on
formatting.
The input buffer defaults to 2 MB. For some sources, for example streaming
RTSP input over UDP, this is not enough and causes significant playback
issues that are not present when playing back the same source under
ffplay/mpv.
It looks like someone had already started working on this feature: the
translated strings, properties, and everything else were ready; only the
control was missing from the UI. This commit adds that control.
Currently, the range is set to 1-16 MB with a 1 MB step. This is somewhat
arbitrary, so suggestions for tweaking the range based on more real-world
use cases are welcome.
(This commit also modifies deps/media-playback)
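A minimal sketch of the added control, assuming a hypothetical
"buffering_mb" setting name; obs_properties_add_int_slider is the
standard libobs call for a bounded integer control:

    /* 1-16 MB range, 1 MB step; the default remains 2 MB */
    obs_data_set_default_int(settings, "buffering_mb", 2);
    obs_properties_add_int_slider(props, "buffering_mb",
                                  obs_module_text("BufferingMB"),
                                  1, 16, 1);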
Previously, the media-playback library detected whether a target was
seekable by checking the filename for "://", which is not ideal because
there are other cases where targets may not be seekable. Instead, an
explicit "seekable" property (off by default) is now available on the
media source when it is not in "local file" mode. Seeking will only be
enabled if local file mode is on, or if "seekable" is explicitly checked
by the user.
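A minimal sketch of the new property, assuming a hypothetical "seekable"
setting name that is only shown when not in local file mode:

    /* off by default; only honored when local file mode is off */
    obs_data_set_default_bool(settings, "seekable", false);
    obs_properties_add_bool(props, "seekable",
                            obs_module_text("Seekable"));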
Closes jp9000/obs-studio#1022
There's no need to display this property at the moment; the default
amount is more than sufficient for most cases. That, and most people
wouldn't know what to do with it anyway.
In the media source, do not pre-cache frames if the source is set to
close the file when inactive, because that setting is designed to allow
the user to replace the file. If the file is replaced, the source can
unintentionally keep the old pre-cached frame and play a frame from the
older video when it starts up.
Preloading is designed to overwrite the source's current internal render
texture so that the first frame is ready to display the moment the
source is first shown.
However, when the media source is set to pause on the last frame, it
keeps that current render texture visible. Preloading the first frame
would overwrite the current render, causing the source to inadvertently
show and pause on the first frame rather than the last. So instead, just
disable preloading when the user has the source set to pause on the last
frame.
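A minimal sketch of the gating this and the previous entry describe,
assuming hypothetical field names on the source's private data:

    /* skip preloading if the file may be swapped out while inactive,
     * or if the last frame must stay visible after playback ends */
    bool preload = !s->close_when_inactive && !s->pause_on_last_frame;
    if (preload)
        obs_source_preload_video(s->source, &frame);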
(Note: This commit also modifies deps/media-playback)
Allows buffering network-based media sources where supported. Default
is two megabytes of buffering.
Due to a noticeable frequency of crashes inside FFmpeg when using
hardware encoding on macOS, this feature is being disabled for now,
pending more investigation at a later time.
(This commit also modifies the decklink, linux-v4l2, mac-avcapture,
obs-ffmpeg, and win-dshow modules)
Originally, async buffering for sources was supposed to be a
user-controllable flag. However, that turned out to be less than ideal
because some sources (such as those in the win-dshow plugin) were
programmed with automatic control over their buffering (for example,
automatically detecting USB 2.0 capture devices and enabling buffering
in those cases).
The fact that it was a flag also caused a design flaw in which the
buffering value would be overwritten when a source was loaded from save
data. Because of that, this flag is being deprecated and replaced with a
specific function to enable unbuffered mode instead.
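A minimal sketch of the replacement, shown against the flag-based call
it supersedes:

    /* Deprecated: buffering toggled through a user-visible source flag,
     * which save data would clobber on load */
    uint32_t flags = obs_source_get_flags(source);
    obs_source_set_flags(source, flags | OBS_SOURCE_FLAG_UNBUFFERED);

    /* Replacement: the source enables unbuffered mode explicitly, so
     * loading save data can no longer overwrite the value */
    obs_source_set_async_unbuffered(source, true);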
This happens because the enum had the incorrect name, and Microsoft's
compilers automatically treat all enums as integers in C, regardless of
whether they actually exist or not. Microsoft makes terrible compilers,
and whoever decided this was a good idea should be fired.
Due to crashes caused by hardware acceleration in the media source on
macOS, disable hardware acceleration in the media source by default for
the time being.
(Also modifies obs-ffmpeg to handle empty frames on EOF)
Previously, the demuxer could hit EOF before the decoder threads were
finished, resulting in truncated output. In the worst-case scenario, the
demuxer could read through small files before ff_decoder_refresh even
had a chance to start the clocks, resulting in no output at all.
If the media source is set to restart on activation, it also shuts down
when not active. However, it would *always* start when the source was
first created, regardless of whether it was active or inactive. It
shouldn't do that; it should start up only when it becomes active.
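A minimal sketch of the corrected startup logic, with hypothetical
names; the activate callback in obs_source_info is the natural place to
start playback:

    struct mp_source {
        obs_source_t *source;
        bool restart_on_activate;
    };

    static void *mps_create(obs_data_t *settings, obs_source_t *source)
    {
        struct mp_source *s = bzalloc(sizeof(*s));
        s->source = source;
        s->restart_on_activate =
            obs_data_get_bool(settings, "restart_on_activate");

        /* do NOT start playback here; wait until activation */
        return s;
    }

    static void mps_activate(void *data)
    {
        struct mp_source *s = data;
        if (s->restart_on_activate)
            mps_start(s); /* hypothetical playback-start helper */
    }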
Certain types of sources (display captures, game captures, audio
device captures, video device captures) should not be duplicated. This
capability flag hints that the source prefers references over full
duplication.
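A minimal sketch of how a capture source might declare the hint,
assuming the flag is the OBS_SOURCE_DO_NOT_DUPLICATE bit in
obs_source_info's output_flags:

    struct obs_source_info display_capture_info = {
        .id = "display_capture",
        .type = OBS_SOURCE_TYPE_INPUT,
        /* hint that scene duplication should reference this source
         * rather than create a second capture */
        .output_flags = OBS_SOURCE_VIDEO | OBS_SOURCE_CUSTOM_DRAW |
                        OBS_SOURCE_DO_NOT_DUPLICATE,
        /* ... */
    };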
Adds the option of making the media file restart when the source becomes
active (such as when switching to a scene that contains it).
Due to the lack of libff features to start/stop/pause/seek media files,
currently this just destroys and recreates the demuxer. Ideally, libff
should gain functions that allow a more optimal means of doing those
things.
Refactors a bit of code related to starting up FFmpeg and makes the
initial view of the media source's properties display the most commonly
desired settings.
Instead of the media source properties showing URL mode by default,
along with a whole bunch of properties that are confusing to most users,
they now start in file mode, with defaults changed to be a bit more
sensible for file input.
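A minimal sketch of the adjusted defaults, assuming a hypothetical
"is_local_file" setting name:

    static void mps_defaults(obs_data_t *settings)
    {
        /* start in local file mode rather than URL/input mode */
        obs_data_set_default_bool(settings, "is_local_file", true);
    }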
Also, as a temporary measure to fix color format issues (some video
files would display their color information incorrectly), forced format
conversion is now enabled by default and has been moved to advanced
settings. Ideally, the actual bug causing the color format issues in
either media-io or libff should be fixed at some point.
When browsing for a file, it would also just use *.* as the file
filter, which is a pain to use. This has been changed to a reasonable
file filter covering common video/audio files, so you don't have to wade
through non-media files just to select a media file. A filter to show
all files is still available as well.
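A minimal sketch of the improved file picker, using the standard libobs
path property; the setting name and filter string here are illustrative:

    /* common media extensions first, with an all-files fallback */
    obs_properties_add_path(props, "local_file",
            obs_module_text("LocalFile"), OBS_PATH_FILE,
            "Media Files (*.mp4 *.mkv *.avi *.mov *.flv *.mp3 *.ogg *.wav);;"
            "All Files (*.*)",
            NULL);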
API changed from:
obs_source_info::get_name(void)
obs_output_info::get_name(void)
obs_encoder_info::get_name(void)
obs_service_info::get_name(void)
API changed to:
obs_source_info::get_name(void *type_data)
obs_output_info::get_name(void *type_data)
obs_encoder_info::get_name(void *type_data)
obs_service_info::get_name(void *type_data)
This allows the type data to be used when getting the name of the
object (useful for plugin wrappers primarily).
NOTE: Though a parameter was added, this is backward-compatible with
older plugins due to calling convention. The new parameter will simply
be ignored by older plugins, and the stack (if used) will be cleaned up
by the caller.
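A minimal sketch of a plugin adapting to the new signature;
UNUSED_PARAMETER is the usual libobs macro for ignoring the argument:

    static const char *my_source_get_name(void *type_data)
    {
        UNUSED_PARAMETER(type_data);
        return obs_module_text("MySource");
    }

A wrapper plugin, by contrast, can register several variants of one
source and use type_data to look up which variant's name to return.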
Instead of using system timestamps for playback, use the timestamps
directly from the video/audio data to ensure all the data is synced up
using the obs_source back-end.
I think the original misconception when this was written was that OBS
would not handle timestamp resets/loops, which isn't the case; it will
actually handle all timestamp resets and abnormalities. It's always
best to use the obs_source back-end for all playback and syncing.
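A minimal sketch of the idea, assuming the decoded AVFrame carries a
valid pts; av_rescale_q converts the stream time base to the nanosecond
timestamps that obs_source_frame expects:

    struct obs_source_frame frame = {0};
    /* ... copy decoded planes into frame ... */

    /* use the data's own pts (in nanoseconds) instead of os_gettime_ns,
     * and let the obs_source back-end handle syncing and resets */
    frame.timestamp = (uint64_t)av_rescale_q(av_frame->pts,
            stream->time_base, (AVRational){1, 1000000000});
    obs_source_output_video(source, &frame);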
Some formats (like WMV) would send out audio packets that had channels
set but did not specify a channel layout. The solution is to no longer
rely on the channel layout to get the channel count, and to instead get
it directly off the FFmpeg audio frame.
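A minimal sketch of the fix, against the FFmpeg API of that era (when
AVFrame still exposed a plain channels field):

    int channels;

    /* Before: derived from the layout, which WMV may leave unset */
    channels = av_get_channel_layout_nb_channels(av_frame->channel_layout);

    /* After: read the channel count directly off the decoded frame */
    channels = av_frame->channels;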