Add API for streaming services. The services API simplifies the
creation of custom service features and user interfaces.
Later on, custom streaming services will be able to do things such as
the following (see the registration sketch after this list):
- Use service-specific APIs via modules, allowing a more direct means
  of communicating with the service and requesting or setting
  service-specific information
- Get the URL/stream key via other means of authentication such as
  OAuth, or build custom URLs for services that require that sort of
  thing
- Query information (such as viewer count, chat, and follower
  notifications)
- Set channel information (such as the current game or the current
  channel title) and activate commercials
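As a rough illustration of the direction, a service would be
implemented as a module that registers an info structure, following
the pattern of the existing source/output APIs. This is only a sketch;
the obs_service_info fields and callback signatures shown here are
illustrative, not the final API.

    /* Sketch of a service module, assuming a registration pattern
     * like the existing source/output APIs. */
    struct my_service {
        char *server;
        char *key;
    };

    static const char *my_service_name(void *type_data)
    {
        UNUSED_PARAMETER(type_data);
        return "My Streaming Service";
    }

    static void *my_service_create(obs_data_t *settings,
            obs_service_t *service)
    {
        struct my_service *data = bzalloc(sizeof(struct my_service));
        data->server = bstrdup(obs_data_get_string(settings, "server"));
        data->key    = bstrdup(obs_data_get_string(settings, "key"));
        UNUSED_PARAMETER(service);
        return data;
    }

    static void my_service_destroy(void *data)
    {
        struct my_service *service = data;
        bfree(service->server);
        bfree(service->key);
        bfree(service);
    }

    static struct obs_service_info my_service = {
        .id       = "my_service",
        .get_name = my_service_name,
        .create   = my_service_create,
        .destroy  = my_service_destroy,
    };

    /* registered from the module's obs_module_load() */
    obs_register_service(&my_service);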
Also, I reduced some repeated code that was used for all libobs
objects. This includes the name of the object, the private data, the
settings, and the signal and procedure handlers.
I also switched to using linked lists for the global object lists
rather than an array of pointers (you could say it was... pointless).
Anyway, the linked list info is also stored in the shared context data
structure.
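Roughly, the shared context data looks something like this (a
simplified sketch of the idea, not the exact internal layout):

    /* Fields common to all libobs objects (sources, outputs,
     * encoders, services) live in one shared structure, including
     * the pointers for the global linked lists. */
    struct obs_context_data {
        char             *name;     /* object name */
        void             *data;     /* object private data */
        obs_data_t       *settings; /* object settings */
        signal_handler_t *signals;  /* signal handler */
        proc_handler_t   *procs;    /* procedure handler */

        struct obs_context_data *next;       /* global list: next */
        struct obs_context_data **prev_next; /* global list: back-link */
    };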
Just wanted the ability to add private data to the properties data.
It makes it a little easier to manage data when you get updates from
controls.
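For example (hypothetical usage; the setter/getter names here are an
assumption in the style of obs_properties_set_param() and
obs_properties_get_param(), with a destroy callback that frees the
private data along with the properties):

    /* attach private data when building the properties */
    struct my_ui_data *ui = bzalloc(sizeof(struct my_ui_data));
    obs_properties_t *props = obs_properties_create();
    obs_properties_set_param(props, ui, bfree);

    /* later, e.g. inside a property-modified callback */
    struct my_ui_data *cur = obs_properties_get_param(props);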
On OSX, clicking the X title bar button immediately destroys "all"
native windows (after sending a close event), which causes
[NSSurface _disposeSurface] to crash if invoked while GL is using
the surface.
This should fix GS rendering on surfaces on HiDPI displays; moving
windows between displays with differing pixel ratios currently requires
a manual resize.
Previously, the properties window would sometimes not receive
a closeEvent, leaving a dangling pointer in OBSBasicMain (and resulting
in a crash when trying to open a new properties window or when closing
the application).
My prior code was incorrect; I mixed up the two parameters. The
GL_TEXTURE_SWIZZLE_* parameter specifies the target channel, and the
value itself specifies the source channel, if that makes sense.
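In other words (standard OpenGL, shown here only for illustration),
mapping a BGRA texture so that sampling returns RGBA looks like this:

    /* The GL_TEXTURE_SWIZZLE_* parameter names the destination
     * channel; the value names the source channel to read from.
     * E.g. "fill the red output from the texture's blue channel": */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_BLUE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_G, GL_GREEN);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_ALPHA);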
The test program and test filter weren't working properly because
their IDs had actually changed.
The test programs also weren't updated for the new main render
callbacks, which must be used when making a program.
Before, async video sources would flicker because they were only drawn
when they were updated: when a source updated, it would draw that
frame, then stop drawing until the next update. This fixes that issue,
and async sources should now draw properly.
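The gist of the fix, as a rough sketch (the helper functions here are
hypothetical):

    /* Cache the most recent async frame in a texture, and draw that
     * texture on every render call instead of only on the call where
     * a new frame happened to arrive. */
    static void render_async_video(obs_source_t *source)
    {
        struct obs_source_frame *frame = get_pending_frame(source);
        if (frame) {
            update_cached_texture(source, frame);
            release_frame(source, frame);
        }

        /* draw the cached frame whether or not it just updated */
        draw_cached_texture(source);
    }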
Also, fix a few other minor bugs and issues relating to async video,
and make it so that non-async video filters can be properly applied to
async sources.
For the purposes of testing, change the 'test-random' source to an async
video source that updates every quarter of a second with a new random
face.
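Schematically, the updated test source does something like this (a
sketch; the tick callback, the struct, and fill_random() are
illustrative stand-ins for the test source's own code):

    /* Push a new random frame roughly every quarter second via
     * obs_source_output_video(). */
    static void random_tick(void *data, float seconds)
    {
        struct random_source *rs = data;

        rs->elapsed += seconds;
        if (rs->elapsed < 0.25f)
            return;
        rs->elapsed = 0.0f;

        fill_random(rs->pixels, WIDTH * HEIGHT);

        struct obs_source_frame frame = {
            .data      = {[0] = (uint8_t *)rs->pixels},
            .linesize  = {[0] = WIDTH * 4},
            .width     = WIDTH,
            .height    = HEIGHT,
            .format    = VIDEO_FORMAT_BGRX,
            .timestamp = os_gettime_ns(),
        };
        obs_source_output_video(rs->source, &frame);
    }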
Also fix a bug where non-async video sources wouldn't have filter
effects applied properly.
A little bit of history about frame dropping:
I did a large number of experiments with frame dropping in old versions
of OBS1, and it's not an easy thing to deal with. I tried just about
everything, from standard i-frame delay, to large buffers, to dumping
packets, to super-unnecessarily-complex things that just ended up
causing more problems than they were worth.
When I did my experiments, I found that the most ideal frame drop
system (in terms of reducing the amount of total data that needed to be
dropped) was in the 0.4xx days, where I had a 3 second frame-drop
buffer. I could calculate the actual buffer size in bytes, and then
intelligently choose packets in that buffer to trim it down to a
specific size while minimizing the number of p-frames and i-frames
dropped, reducing the actual impact of dropped frames on the stream.
The downside was that it required too much extra latency, and far too
many people complained about it, so it was removed in favor of the
current system.
The current system I refer to simply as 'packet dumping', which, when
combined with low keyframe intervals (like most services use these
days), is the next-best method in my experience. Just dump the buffer
when you reach a threshold of buffering (which I prefer to measure with
time rather than size), then wait for a new i-frame. Simple, effective,
and it reduces the risk of consecutive buffering while still having a
fairly low impact on the stream output, due to the low keyframe
interval of services.
By the way, audio will not (and should not ever) be dropped, lest you
end up with syncing issues (among other nasty things) specific to the
server implementation.
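A bare-bones sketch of that packet dumping logic (all names here are
hypothetical; this shows the idea, not the actual output code):

    static void check_drop(struct my_stream *stream)
    {
        /* measure buffering by time rather than by size */
        if (buffered_duration_ns(stream) < stream->drop_threshold_ns)
            return;

        /* dump the buffered video packets; audio is never dropped */
        struct encoder_packet pkt;
        while (pop_buffered_video_packet(stream, &pkt))
            free_packet(&pkt);

        /* then wait for a new i-frame before sending video again */
        stream->waiting_for_keyframe = true;
    }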
On some operating systems with specific drivers, it seems that BGR/BGRA
isn't properly treated as such in certain cases. This fix will
hopefully force the formats to be treated as BGR/BGRA when actually
rendering, which should get around the implementation-specific issue.
- Fix an issue that could occur when using more than one video
  encoder. Audio/video would not sync up correctly because they were
  expected to be paired with a particular encoder. This simply adds a
  little helper variable to encoder packets that specifies the system
  time in microseconds; we then use that system time to sync (see the
  sketch after this list).
- Fix an issue with x264 with fractional FPS rates (particularly 29.97
  and 59.94) where it would create ridiculously large stream outputs.
  The problem was that you shouldn't set the timebase_* variables in
  the x264 params manually; let x264 handle the default values and
  leave them at 0 (see the sketch after this list).
- Make x264 use CFR output, because there's no reason to ever use VFR
in this case.
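To illustrate the sync fix (a sketch; the field name follows the
description above and may not match the code exactly):

    /* Stamp each encoder packet with the system time in microseconds
     * so audio/video from different encoders can be matched by
     * wall-clock time rather than by encoder pairing. */
    static inline void stamp_packet_sys_time(struct encoder_packet *packet)
    {
        packet->sys_dts_usec = (int64_t)os_gettime_ns() / 1000;
    }

And the x264 part, expressed in terms of the x264 API itself:

    /* Leave i_timebase_num/i_timebase_den alone so x264 derives the
     * timebase from the FPS values, and disable VFR input so the
     * output is CFR. */
    x264_param_t param;
    x264_param_default_preset(&param, "veryfast", NULL);

    param.i_fps_num   = 30000; /* e.g. 29.97 FPS = 30000/1001 */
    param.i_fps_den   = 1001;
    param.b_vfr_input = 0;     /* CFR output */
    /* do NOT set param.i_timebase_num/param.i_timebase_den manually */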