Buffering inside libobs is enabled by default and adds latency to the
video. This commit adds a property to toggle the buffering.
Since the latency introduced by buffering changes on every startup, the
default is set to unbuffered mode, even though the previous
implementation always enabled buffering.
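As a rough sketch of how such a property can be wired up (the property
name "buffering" and the callback names are illustrative, not
necessarily the plugin's actual ones):

```c
#include <obs-module.h>

/* Default to unbuffered, since the buffering latency varies per startup. */
static void camera_defaults(obs_data_t *settings)
{
	obs_data_set_default_bool(settings, "buffering", false);
}

/* Expose a checkbox so the user can re-enable libobs frame buffering. */
static obs_properties_t *camera_properties(void *data)
{
	UNUSED_PARAMETER(data);

	obs_properties_t *props = obs_properties_create();
	obs_properties_add_bool(props, "buffering", "Enable frame buffering");
	return props;
}

/* Forward the setting to libobs, which buffers async video by default. */
static void apply_buffering(obs_source_t *source, obs_data_t *settings)
{
	bool buffering = obs_data_get_bool(settings, "buffering");
	obs_source_set_async_unbuffered(source, !buffering);
}
```

obs_source_set_async_unbuffered() is the libobs call that toggles the
async frame buffering described above.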
When the mode is set to auto, the API sometimes sends frames with 0x0
size. If such frames are not filtered out, libobs outputs error
messages.
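A minimal sketch of the filter on the async video path (function names
are illustrative):

```c
#include <obs-module.h>

/* Drop zero-sized frames before handing them to libobs; auto mode
 * occasionally delivers 0x0 frames that would otherwise trigger error
 * messages. */
static void output_frame(obs_source_t *source,
			 struct obs_source_frame *frame)
{
	if (frame->width == 0 || frame->height == 0)
		return;

	obs_source_output_video(source, frame);
}
```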
This change fixes an issue with the CMIO DAL plugin where the CMIO
subsystem would log multiple errors when starting the virtual camera,
due to certain properties that could not be set (frame rate and format).
For now, we just ignore the assignment, but mark the property as
settable to suppress the error messages that are reported by the CMIO
subsystem.
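A hedged sketch of the idea, using a hypothetical helper that decides
which stream properties are reported as settable but whose values are
silently ignored:

```c
#include <CoreMediaIO/CMIOHardware.h>
#include <stdbool.h>

/* Hypothetical helper: these properties are reported as settable so the
 * CMIO subsystem does not log errors, but writes to them are ignored. */
static bool silently_accept_property(CMIOObjectPropertySelector selector)
{
	switch (selector) {
	case kCMIOStreamPropertyFrameRate:
	case kCMIOStreamPropertyFormatDescription:
		return true;
	default:
		return false;
	}
}
```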
This change fixes an issue in the Mach server exposed by the macOS
virtual camera for OBS, where it would not invalidate ports that were
disconnected by the remote application, causing sporadic crashes.
These crashes can be reproduced in previous builds by opening the
virtual camera in a remote application and then quitting that
application (without stopping the virtual camera).
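One way to detect such disconnects at the Mach level is to request a
dead-name notification for each client port; this is a sketch of the
general technique, not necessarily the exact mechanism used by the
server:

```c
#include <mach/mach.h>
#include <mach/notify.h>

/* Ask the kernel to send a dead-name notification to `notify_port` when
 * `client_port` (a send right naming a port in the remote application)
 * dies, so the server can drop the stale client instead of crashing. */
static kern_return_t watch_client(mach_port_t client_port,
				  mach_port_t notify_port)
{
	mach_port_t previous = MACH_PORT_NULL;
	return mach_port_request_notification(mach_task_self(), client_port,
					      MACH_NOTIFY_DEAD_NAME, 0,
					      notify_port,
					      MACH_MSG_TYPE_MAKE_SEND_ONCE,
					      &previous);
}
```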
This change updates the implementation of the mac-virtualcam plugin to
not use any global state and instead rely on the state object that is
passed by the OBS module system.
This approach is similar to the virtual camera implementations for Linux
and Windows.
This change removes the unused CMSampleBuffer utility functions that
were left over from the previous implementation. Since we construct the
CMSampleBuffer directly from an IOSurface, we no longer need any custom
construction logic; that is now performed by the OBS plugin.
This change updates the mac-virtualcam implementation to conditionally
enable conversion of the output video format. Previously, the output
video was always converted to UYVY. However, this conversion incurs
high CPU usage, as reported in:
https://github.com/johnboiles/obs-mac-virtualcam/issues/102
Therefore, conversion is now disabled when the selected output format
(e.g., NV12) is natively supported by CoreVideo's pixel buffers.
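A sketch of the format check (the helper is hypothetical): only formats
with a native CoreVideo equivalent skip the conversion step:

```c
#include <obs-module.h>
#include <CoreVideo/CoreVideo.h>

/* Map the OBS output format to a CoreVideo pixel format where a native
 * equivalent exists; returning 0 means the frame still has to be
 * converted (the previous behavior of always converting to UYVY). */
static OSType cv_format_for(enum video_format format)
{
	switch (format) {
	case VIDEO_FORMAT_UYVY:
		return kCVPixelFormatType_422YpCbCr8;
	case VIDEO_FORMAT_NV12:
		return kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;
	default:
		return 0;
	}
}
```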
This change updates the plugin to support video formats that contain
multiple planes (such as NV12). Such functionality is necessary to
prevent transcoding the raw video data, which is often delivered in a
planar format.
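A minimal sketch of a per-plane copy into a CVPixelBuffer; the
src/linesize arrays mirror the planar layout libobs delivers (an
assumption for illustration):

```c
#include <CoreVideo/CoreVideo.h>
#include <stdint.h>
#include <string.h>

/* Copy a planar frame (e.g. NV12) into a CVPixelBuffer plane by plane,
 * respecting the stride of both the source and the destination. */
static void copy_planes(CVPixelBufferRef buf, uint8_t *src[],
			uint32_t linesize[])
{
	CVPixelBufferLockBaseAddress(buf, 0);

	size_t planes = CVPixelBufferGetPlaneCount(buf);
	for (size_t i = 0; i < planes; i++) {
		uint8_t *dst = CVPixelBufferGetBaseAddressOfPlane(buf, i);
		size_t dst_stride = CVPixelBufferGetBytesPerRowOfPlane(buf, i);
		size_t height = CVPixelBufferGetHeightOfPlane(buf, i);
		size_t copy = dst_stride < linesize[i] ? dst_stride
						       : linesize[i];

		for (size_t row = 0; row < height; row++)
			memcpy(dst + row * dst_stride,
			       src[i] + row * linesize[i], copy);
	}

	CVPixelBufferUnlockBaseAddress(buf, 0);
}
```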
This change updates the mac-virtualcam implementation to pool the
CVPixelBuffers used to share the output frames. This allows the plugin
to recycle pixel buffers instead of allocating a new one for every
frame.
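A hedged sketch of creating such a pool of IOSurface-backed NV12
buffers with CVPixelBufferPoolCreate (the chosen attributes are
illustrative):

```c
#include <CoreFoundation/CoreFoundation.h>
#include <CoreVideo/CoreVideo.h>

static CVPixelBufferPoolRef create_pool(int32_t width, int32_t height)
{
	int32_t format = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;

	CFNumberRef w = CFNumberCreate(NULL, kCFNumberSInt32Type, &width);
	CFNumberRef h = CFNumberCreate(NULL, kCFNumberSInt32Type, &height);
	CFNumberRef fmt = CFNumberCreate(NULL, kCFNumberSInt32Type, &format);

	/* An empty IOSurface properties dictionary requests IOSurface
	 * backing, which the plugin needs in order to share frames. */
	CFDictionaryRef iosurface = CFDictionaryCreate(NULL, NULL, NULL, 0,
			&kCFTypeDictionaryKeyCallBacks,
			&kCFTypeDictionaryValueCallBacks);

	const void *keys[] = {kCVPixelBufferWidthKey, kCVPixelBufferHeightKey,
			      kCVPixelBufferPixelFormatTypeKey,
			      kCVPixelBufferIOSurfacePropertiesKey};
	const void *values[] = {w, h, fmt, iosurface};
	CFDictionaryRef attrs = CFDictionaryCreate(NULL, keys, values, 4,
			&kCFTypeDictionaryKeyCallBacks,
			&kCFTypeDictionaryValueCallBacks);

	CVPixelBufferPoolRef pool = NULL;
	CVPixelBufferPoolCreate(NULL, NULL, attrs, &pool);

	CFRelease(attrs);
	CFRelease(iosurface);
	CFRelease(fmt);
	CFRelease(h);
	CFRelease(w);
	return pool;
}
```

Buffers are then obtained with CVPixelBufferPoolCreatePixelBuffer() and
return to the pool automatically once their last reference is released.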
This change updates the virtual camera implementation on macOS to
utilize IOSurface to share the output feed with the virtual cameras.
By using IOSurface, we remove the need for copying the frames across
multiple buffers, since they can be shared across Mach connections using
zero-copy.
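A sketch of the zero-copy hand-off (not necessarily the exact message
layout the plugin uses): the sender wraps the IOSurface backing a frame
in a Mach send right, and the receiver wraps it back into a
CVPixelBuffer without copying pixel data:

```c
#include <CoreVideo/CoreVideo.h>
#include <IOSurface/IOSurface.h>
#include <mach/mach.h>

/* Sender: extract the IOSurface backing the pixel buffer and create a
 * Mach port that can be sent to the DAL plugin. */
static mach_port_t surface_port_for_frame(CVPixelBufferRef frame)
{
	IOSurfaceRef surface = CVPixelBufferGetIOSurface(frame);
	return surface ? IOSurfaceCreateMachPort(surface) : MACH_PORT_NULL;
}

/* Receiver: look the surface up from the received port and wrap it in a
 * CVPixelBuffer that references the same memory. */
static CVPixelBufferRef frame_from_surface_port(mach_port_t port)
{
	IOSurfaceRef surface = IOSurfaceLookupFromMachPort(port);
	if (!surface)
		return NULL;

	CVPixelBufferRef frame = NULL;
	CVPixelBufferCreateWithIOSurface(NULL, surface, NULL, &frame);
	CFRelease(surface);
	return frame;
}
```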
This change fixes an issue where the DAL plugin would not load because
it did not support the arm64e architecture. The build configuration is
updated to produce a universal binary that includes arm64e as well.
See https://github.com/obsproject/obs-studio/issues/6285 for more
information regarding this issue.
Sometimes when reconnecting, the internal RTMP data is not cleared
(particularly the TLS data). This can cause TLS data to carry over from
one connection to the next, causing issues with the secondary
connection.
This adds a circular buffer to ffmpeg-mux when writing to a file.
Output from ffmpeg is buffered so that slow disk I/O does not block
ffmpeg writes; otherwise the pipe becomes full, OBS stops sending
frames, and a misleading "Encoding overloaded!" warning is shown. The
buffer may grow to 256 MB depending on the rate of data coming in and
going out; if the buffer is full, OBS will block in ffmpeg writes.
A separate I/O thread is responsible for processing the contents of
the buffer and writing them to the output file. It tries to process 1 MB
at a time to minimize small I/O.
Complicating things considerably, some formats in ffmpeg require seeking
on the output, so we can't just treat everything as a stream of bytes.
To handle this, we record the offset of each write and try to buffer
as many contiguous writes as possible. This unfortunately makes the
code quite complicated, but it is hopefully well commented.
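The following is a heavily simplified sketch of that scheme, not the
actual ffmpeg-mux code: each queued write carries its file offset, the
total pending data is capped at 256 MB, and a separate I/O thread
coalesces contiguous chunks before seeking and writing:

```c
#include <pthread.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_PENDING (256 * 1024 * 1024) /* 256 MB cap from the commit */

struct chunk {
	int64_t offset;
	size_t size;
	uint8_t *data;
	struct chunk *next;
};

struct buffered_writer {
	FILE *file;
	struct chunk *head, *tail;
	size_t pending;
	bool stop;
	pthread_mutex_t lock;
	pthread_cond_t cond;
};

/* Producer side: queue the write, blocking only if the cap is reached. */
static void bw_write(struct buffered_writer *bw, int64_t offset,
		     const void *data, size_t size)
{
	struct chunk *c = malloc(sizeof(*c));
	c->offset = offset;
	c->size = size;
	c->data = malloc(size);
	memcpy(c->data, data, size);
	c->next = NULL;

	pthread_mutex_lock(&bw->lock);
	while (bw->pending + size > MAX_PENDING)
		pthread_cond_wait(&bw->cond, &bw->lock);

	if (bw->tail)
		bw->tail->next = c;
	else
		bw->head = c;
	bw->tail = c;
	bw->pending += size;
	pthread_cond_signal(&bw->cond);
	pthread_mutex_unlock(&bw->lock);
}

/* I/O thread: gather contiguous chunks, then seek once and write them. */
static void *bw_io_thread(void *param)
{
	struct buffered_writer *bw = param;

	pthread_mutex_lock(&bw->lock);
	while (!bw->stop || bw->head) {
		if (!bw->head) {
			pthread_cond_wait(&bw->cond, &bw->lock);
			continue;
		}

		struct chunk *c = bw->head;
		int64_t offset = c->offset;
		size_t batch = 0;

		/* Gather contiguous chunks, aiming for ~1 MB per write. */
		while (c && c->offset == offset + (int64_t)batch &&
		       batch < 1024 * 1024) {
			batch += c->size;
			c = c->next;
		}

		pthread_mutex_unlock(&bw->lock);
		fseeko(bw->file, offset, SEEK_SET);

		pthread_mutex_lock(&bw->lock);
		while (batch) {
			struct chunk *done = bw->head;
			bw->head = done->next;
			if (!bw->head)
				bw->tail = NULL;
			bw->pending -= done->size;
			batch -= done->size;

			pthread_mutex_unlock(&bw->lock);
			fwrite(done->data, 1, done->size, bw->file);
			free(done->data);
			free(done);
			pthread_mutex_lock(&bw->lock);
		}
		pthread_cond_broadcast(&bw->cond);
	}
	pthread_mutex_unlock(&bw->lock);
	return NULL;
}
```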
The original PR missed assigning the `idx` variable in unregister.
When compiled without asserts, this would silently fail to delete
sources. Instead, correctly assign `idx` and skip unregistration if the
source does not appear to be registered.
Fixes #6532
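An illustrative sketch of the corrected logic (the list type and helper
names are hypothetical):

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

struct source_list {
	void **array;
	size_t num;
};

/* Look up the source's index before removing it, and skip the removal
 * entirely if the source is not registered, instead of relying on an
 * assert (and on `idx` being assigned as a side effect). */
static void unregister_source(struct source_list *list, const void *source)
{
	size_t idx = 0;
	bool found = false;

	for (; idx < list->num; idx++) {
		if (list->array[idx] == source) {
			found = true;
			break;
		}
	}

	if (!found)
		return; /* not registered: nothing to delete */

	memmove(&list->array[idx], &list->array[idx + 1],
		(list->num - idx - 1) * sizeof(void *));
	list->num--;
}
```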
Some webcams, and other AVCaptureDevices such as connected iOS devices
(supported since 162450c), have an audio output that has so far been
ignored.
Now, if supported, the audio is captured alongside the video. For
existing sources, the properties include a button to enable this, while
new sources have it enabled by default.
Previously, SPEAKER_4POINT0 was mapped to AV_CH_LAYOUT_QUAD, but this
was later changed to AV_CH_LAYOUT_4POINT0 [1]. That change was missed
in obs-ffmpeg-mux; this commit remedies that.
[1] 67e48ecc2c
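As a sketch of the corrected mapping (the helper is hypothetical):
AV_CH_LAYOUT_QUAD is FL+FR+BL+BR, whereas AV_CH_LAYOUT_4POINT0 is
FL+FR+FC+BC, which is what the 4.0 layout is meant to be:

```c
#include <stdint.h>
#include <libavutil/channel_layout.h>
#include <media-io/audio-io.h>

/* Hypothetical mapping helper showing the corrected case: 4.0 maps to
 * AV_CH_LAYOUT_4POINT0 (FL+FR+FC+BC) rather than AV_CH_LAYOUT_QUAD
 * (FL+FR+BL+BR). */
static uint64_t ff_layout_from_obs(enum speaker_layout layout)
{
	switch (layout) {
	case SPEAKERS_MONO:
		return AV_CH_LAYOUT_MONO;
	case SPEAKERS_STEREO:
		return AV_CH_LAYOUT_STEREO;
	case SPEAKERS_4POINT0:
		return AV_CH_LAYOUT_4POINT0; /* previously AV_CH_LAYOUT_QUAD */
	default:
		return AV_CH_LAYOUT_STEREO;
	}
}
```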
Signed-off-by: pkv <pkv@obsproject.com>
The channel_layout API was overhauled by FFmpeg [1-4]. The previous
bitmask channel_layout is replaced by a struct, ch_layout, which
combines the number of channels, a bitmask, and other information. This
struct must now be supplied to AVFrame since avutil >= 57.24.100 and to
AVCodecContext since avcodec >= 59.24.100, per [1].
This commit provides the required info to ffmpeg-mux,
obs-ffmpeg-output, and obs-ffmpeg-audio-encoders.
[1] Bump minor versions after the channel layout changes
cdba98bb80
[2] lavc: switch to the new channel layout API
548aeb9383
[3] avutil/channel_layout: Add a new channel layout API
086a804806
[4] avframe: switch to the new channel layout API
db6efa1815
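A minimal sketch of the new API usage (assuming avutil >= 57.24.100;
the function is illustrative):

```c
#include <libavcodec/avcodec.h>
#include <libavutil/channel_layout.h>
#include <libavutil/frame.h>

/* Fill the new AVChannelLayout struct on both the codec context and the
 * frame, instead of the old channel_layout/channels bitmask fields. */
static void set_channel_layout(AVCodecContext *ctx, AVFrame *frame,
			       int channels)
{
	AVChannelLayout layout;
	av_channel_layout_default(&layout, channels);

	av_channel_layout_copy(&ctx->ch_layout, &layout);
	av_channel_layout_copy(&frame->ch_layout, &layout);

	av_channel_layout_uninit(&layout);
}
```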
Signed-off-by: pkv <pkv@obsproject.com>