Changed API functions:
libobs: obs_reset_video
Before, video initialization returned a boolean, but "failed" alone is
too little information; if it fails due to a lack of device
capabilities or bad video device parameters, the front-end needs to
know that.
The OBS Basic UI has also been updated to reflect this API change.
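As a rough sketch of what a front-end can now do with the return
value; the error-code names and the show_error helper below are
illustrative assumptions, not confirmed exports:

    struct obs_video_info ovi = {0};
    /* ... fill in graphics module, resolution, format, FPS ... */

    int ret = obs_reset_video(&ovi);
    switch (ret) {
    case OBS_VIDEO_SUCCESS:
            break;
    case OBS_VIDEO_NOT_SUPPORTED:
            /* the device lacks required capabilities */
            show_error("Your graphics hardware is not supported.");
            break;
    case OBS_VIDEO_INVALID_PARAM:
            /* bad video device parameters */
            show_error("Invalid video settings, please check them.");
            break;
    default:
            show_error("Failed to initialize video.");
            break;
    }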
NOTE: In texture_setimage, I had to move variable declarations to the
top of the scope because Microsoft's C compiler still enforces the
legacy C90 rule and errors out with:
'illegal use of this type as an expression'.
To sum it up, Microsoft's C compiler is still utter garbage.
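For illustration only (these functions are made up, not the actual
texture_setimage body), this is the shape of the change MSVC forces in
C mode:

    #include <stdint.h>
    #include <string.h>

    /* MSVC's C compiler rejects this: a declaration of a
     * typedef'd type after a statement in the same block
     * triggers "illegal use of this type as an expression": */
    void copy_rows_bad(uint8_t *dst, const uint8_t *src,
                       uint32_t rows, uint32_t row_size)
    {
            memset(dst, 0, rows * row_size);
            uint32_t i; /* <-- error here under MSVC */
            for (i = 0; i < rows; i++)
                    memcpy(dst + i * row_size,
                           src + i * row_size, row_size);
    }

    /* The fix: hoist all declarations to the top of the scope: */
    void copy_rows_good(uint8_t *dst, const uint8_t *src,
                        uint32_t rows, uint32_t row_size)
    {
            uint32_t i;

            memset(dst, 0, rows * row_size);
            for (i = 0; i < rows; i++)
                    memcpy(dst + i * row_size,
                           src + i * row_size, row_size);
    }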
...I'm actually concerned that I went a bit overkill trying to prevent
backward-compatibility issues with this abstraction design: a large
number of files have to be modified just to add a single graphics
subsystem export. Someone's going to strangle me, and when you know
that someone might strangle you, that means you did something wrong.
We'll have to look into simplifying this in the future without killing
backward-compatibility safety.
These functions were mostly related to being able to set true
fullscreen mode -- however, that has no place for our purposes, and
these functions were just sitting empty and unused, so they have been
removed. Besides, fullscreen mode only applies to the Windows
operating system.
This variable is currently somewhat pointless: I was originally going
to use it to tell the graphics subsystem to completely rebuild the
internal vertex buffers, but allowing that functionality would be
inefficient and a bad idea.
Previously we were using glGetAttribLocation on all inputs/outputs and
simply discarding the result if it returned -1. However, we have a
boolean 'input' value in gl_parser_attrib, so there's no need to do
this and throw away potentially useful error-handling information.
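A hedged sketch of the resulting pattern; gl_parser_attrib's exact
field names here are assumptions:

    static bool get_attrib_location(GLuint program,
                                    struct gl_parser_attrib *attrib,
                                    GLint *loc)
    {
            /* outputs have no attribute location; skip them
             * explicitly instead of probing and ignoring -1 */
            if (!attrib->input) {
                    *loc = -1;
                    return true;
            }

            *loc = glGetAttribLocation(program, attrib->name.array);
            if (*loc == -1) {
                    blog(LOG_ERROR, "glGetAttribLocation failed "
                                    "for input '%s'",
                         attrib->name.array);
                    return false;
            }

            return true;
    }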
BGRX was being treated as "BGR input" with "RGBA storage", when it
should have been "BGRA input" with "RGB storage". So the input for the
texture was expecting 24 bits of packed BGR rather than 32-bit BGRX
pixels, and it was internally being stored with alpha available.
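In GL terms, the difference is roughly this (the exact enums used in
the code may differ):

    /* before: 24-bit packed BGR input, stored with alpha */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_BGR, GL_UNSIGNED_BYTE, data);

    /* after: 32-bit BGRX input, alpha-less storage */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, data);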
My prior code was incorrect; I mixed up the two parameters. The
GL_TEXTURE_SWIZZLE_* parameter specifies the target channel, and the
value itself specifies the source channel, if that makes sense.
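Concretely, the parameter names the destination channel the shader
sees, and the value names the stored texel channel to pull from:

    /* when the shader reads R, fetch the texel's blue channel;
     * when it reads B, fetch the texel's red channel */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_BLUE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);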
On some operating systems with specific drivers, it seems that
BGR/BGRA isn't properly treated as such in certain cases. This fix
will hopefully force the formats to be treated as BGR/BGRA when
actually rendering, which should get around the
implementation-specific issue.
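One common way to pin the component order down explicitly -- whether
or not it matches the exact change made here is an assumption -- is to
upload with a packed pixel type instead of GL_UNSIGNED_BYTE:

    /* GL_BGRA + GL_UNSIGNED_INT_8_8_8_8_REV describes the
     * little-endian BGRA byte order unambiguously, leaving the
     * driver no room to guess */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, data);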
1) Fixed the preview window. It now correctly displays the source.
2) The GLX backend now correctly uses the device's current swap chain.
3) We now set device->cur_swap to a default so we don't have to check it in every function (see the sketch after this list).
4) Minor syntactical cleanups, and perhaps some messiness added.
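For item 3, a minimal sketch of the idea during device creation; the
struct layout and field names here are assumptions:

    struct gs_device *device = bzalloc(sizeof(struct gs_device));

    /* ... create the GLX context and the default swap chain ... */

    /* default cur_swap up front so the rest of the backend can
     * use device->cur_swap directly instead of NULL-checking it
     * in every function */
    device->cur_swap = &device->plat->swap;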
This unfortunately re-introduces undesirable rendering behaviour for
slow renderers (e.g. first-gen Intel HD graphics/Apple software
renderer) when the property window is open, but fixes property window
preview rendering for sufficiently fast renderers.
- Add dummy GL texture support to allow libobs texture references to be
created for GL without
- Add a texture_getobj function to allow retrieval of the
  context-specific object, such as the D3D texture pointer or the
  OpenGL texture object handle (a usage sketch follows this list).
- Also cleaned up the export stuff. I realized it was all totally
superfluous. Kind of a dumb moment, but nice to clean it up
regardless.
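A usage sketch for texture_getobj; the texture_t type and the cast
convention are assumptions about this era of the API:

    static void bind_libobs_texture(texture_t tex)
    {
            /* on the GL backend the opaque object carries the
             * GLuint texture handle; on D3D it would be the
             * texture's interface pointer instead */
            GLuint handle = (GLuint)(uintptr_t)texture_getobj(tex);
            glBindTexture(GL_TEXTURE_2D, handle);
    }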