On Mac and Windows the libraries are always meant to be portable, so
the naming convention does not need to change for each revision;
versioned library names make sense for Linux, but not so much for
Windows and Mac.
Because libobs-opengl is a public library, it's customary to have the
SONAME embedded in the library file. Also remove the prefix override
and drop the explicit "lib" prefix from the output name. This also
requires passing the actual library file name to dlopen invocations.
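For illustration, loading by the full file name on Linux would look
roughly like this (the module file name and version number here are
placeholders, not necessarily the real ones):

    #include <dlfcn.h>
    #include <stdio.h>

    /* Illustrative only: open the OpenGL graphics module by its full
     * file name now that the SONAME is embedded on Linux. */
    static void *load_graphics_module(void)
    {
            /* placeholder file name/version */
            void *handle = dlopen("libobs-opengl.so.0", RTLD_LAZY);
            if (!handle)
                    fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return handle;
    }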
The glDebugMessageCallback function sets a callback that relays all
messages coming from the driver (on supported drivers, at least),
which can be nice or not depending on the type of application you're
writing; on NVIDIA drivers in particular there are sometimes a lot of
irrelevant messages.
At first I thought these messages were important, but it turns out
they're almost always irrelevant and not actual warnings. Messages of
a certain type/severity are mostly for debugging purposes or minor
hints about how to maximize performance, so unfortunately there ends
up being a lot of pointless spam in the debug output.
The performance messages in particular relate to optimizations you can
get via multithreading, the kind of optimizations you would expect for
games rather than for this type of application, so they rarely apply
to our use cases.
High-severity messages, however, are never omitted, regardless of
message type.
These messages can be enabled again by simply defining the
SHOW_ALL_GL_MESSAGES macro.
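Roughly what the filtering looks like, as a sketch (the exact filter
criteria and function names in the real code may differ; this assumes
a GL 4.3/KHR_debug context and a loader such as glad):

    #include <glad/glad.h>
    #include <stdio.h>

    static void APIENTRY gl_debug_proc(GLenum source, GLenum type,
                    GLuint id, GLenum severity, GLsizei length,
                    const GLchar *message, const void *param)
    {
            (void)source; (void)id; (void)length; (void)param;

    #ifndef SHOW_ALL_GL_MESSAGES
            /* drop performance hints and low-severity notifications,
             * but never drop high-severity messages */
            if (severity != GL_DEBUG_SEVERITY_HIGH &&
                (type == GL_DEBUG_TYPE_PERFORMANCE ||
                 severity == GL_DEBUG_SEVERITY_NOTIFICATION))
                    return;
    #endif
            fprintf(stderr, "GL: %s\n", message);
    }

    static void enable_gl_debug_output(void)
    {
            glEnable(GL_DEBUG_OUTPUT);
            glDebugMessageCallback(gl_debug_proc, NULL);
    }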
This fixes a minor flaw in the API where data always had to be mutable
to be usable by the API.
Functions that do not modify the fundamental underlying data of a
structure should take that structure as a const parameter, both for
safety and to signify that the parameter is input-only and will not be
modified by the function using it.
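For example (hypothetical names, not the actual libobs API), an
input-only parameter ends up reading like this:

    struct vec3 {
            float x, y, z;
    };

    /* Only reads *v, so the parameter is const: callers can pass
     * immutable data, and the signature documents that nothing is
     * modified. */
    static float vec3_len_squared(const struct vec3 *v)
    {
            return v->x * v->x + v->y * v->y + v->z * v->z;
    }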
Pointer typedefs are unsafe. If you do:
typedef struct bla *bla_t;
then you cannot use it to make the data constant, such as: const bla_t,
because the const applies to the pointer itself rather than to the
underlying data. I admit this was a fundamental mistake that must
be corrected.
All typedefs that were pointer types now have the pointer removed from
the type itself, and the pointer is written out where the type is
actually used as a variable/parameter/return value instead.
This does not break ABI though, which is pretty nice.
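A minimal illustration of the pitfall and of the new convention (type
names are placeholders):

    struct bla {
            int value;
    };

    /* old style: the pointer is baked into the typedef */
    typedef struct bla *bla_old_t;

    /* new style: the typedef names the struct, pointers are explicit */
    typedef struct bla bla_t;

    static void old_api(const bla_old_t b)
    {
            /* "const bla_old_t" means "struct bla *const": the pointer
             * is const, the data is not, so this still compiles */
            b->value = 1;
    }

    static void new_api(const bla_t *b)
    {
            /* "const bla_t *" protects the data itself; writing
             * through b here would be a compile error */
            (void)b->value;
    }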
This replaces the ARB_separate_shader_objects extension with
traditional linked shaders. I was able to get the existing system to
use linked shaders without having to change any of the libobs graphics
API.
This essentially creates a linked list of shader programs with
references to the shaders they link. Before draw, it searches that
linked list for a particular pixel/vertex shader pair, and the linked
program associated with it. If no matching program exists, it creates
the program.
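A sketch of that find-or-link logic (the structure and function names
are illustrative, not the actual libobs-opengl internals):

    #include <stdlib.h>
    #include <glad/glad.h>

    struct gs_program_node {
            GLuint vertex_shader;
            GLuint pixel_shader;
            GLuint program;
            struct gs_program_node *next;
    };

    static struct gs_program_node *program_list;

    /* look up the linked program for a shader pair, linking it on
     * first use */
    static GLuint get_linked_program(GLuint vs, GLuint ps)
    {
            struct gs_program_node *node;

            for (node = program_list; node; node = node->next) {
                    if (node->vertex_shader == vs &&
                        node->pixel_shader == ps)
                            return node->program;
            }

            node = calloc(1, sizeof(*node));
            if (!node)
                    return 0;

            node->vertex_shader = vs;
            node->pixel_shader = ps;
            node->program = glCreateProgram();
            glAttachShader(node->program, vs);
            glAttachShader(node->program, ps);
            glLinkProgram(node->program);
            /* real code would check GL_LINK_STATUS here */

            node->next = program_list;
            program_list = node;
            return node->program;
    }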
Changed API functions:
libobs: obs_reset_video
Before, video initialization returned a boolean, but "failed" alone is
too little information; if it fails due to lack of device capabilities
or bad video device parameters, the front-end needs to know that.
The OBS Basic UI has also been updated to reflect this API change.
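As a rough front-end sketch of what that extra information enables
(the specific return-code names follow later libobs headers and are
used here only for illustration):

    #include <obs.h>

    static bool reset_video_checked(struct obs_video_info *ovi)
    {
            int ret = obs_reset_video(ovi);

            switch (ret) {
            case OBS_VIDEO_SUCCESS:
                    return true;
            case OBS_VIDEO_NOT_SUPPORTED:
                    /* device lacks required capabilities */
                    blog(LOG_ERROR, "Video device not supported");
                    return false;
            case OBS_VIDEO_INVALID_PARAM:
                    /* bad video parameters from the front-end */
                    blog(LOG_ERROR, "Invalid video parameters");
                    return false;
            default:
                    blog(LOG_ERROR, "obs_reset_video failed: %d", ret);
                    return false;
            }
    }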
NOTE: In texture_setimage, I had to move variable declarations to the
top of the scope because Microsoft's C compiler gives the legacy C90
error: 'illegal use of this type as an expression'.
To sum it up, Microsoft's C compiler is still utter garbage.
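For reference, the restriction looks like this (hypothetical body, not
the actual texture_setimage code):

    #include <stdint.h>
    #include <string.h>

    /* C90: all declarations must come before any statement in a block */
    static void copy_rows(uint8_t *dst, const uint8_t *src,
                    size_t rows, size_t row_size)
    {
            size_t i;       /* declared up here ... */

            for (i = 0; i < rows; i++)
                    memcpy(dst + i * row_size, src + i * row_size,
                                    row_size);

            /* ... because declaring it after a statement triggers the
             * error quoted above under MSVC */
    }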
...I'm actually concerned that I went a bit overkill trying to prevent
backwards compatibility issues with this abstraction design, because
this is a large number of files that have to be modified just to add a
single graphics subsystem export. Someone's going to strangle me, and
when you know that someone might strangle you, that means that you did
something wrong. We'll have to look into simplifying this in the
future without compromising backward-compatibility safety.
These functions were mostly related to setting true fullscreen mode --
however, this has no place for our purposes, and these functions were
just sitting empty and unused, so they should be removed.
Besides, fullscreen mode only applies to the Windows operating system.
This variable is currently somewhat pointless; I was originally going
to use it to tell the graphics subsystem to completely rebuild the
internal vertex buffers, but allowing that functionality would be
bad/inefficient.
Previously we were calling glGetAttribLocation on all inputs/outputs
and simply discarding the result if it returned -1. However,
gl_parser_attrib already has a boolean 'input' member, so there's no
need to do this and throw away potentially useful error-handling
information.
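Roughly the difference (the struct below is a stand-in for
gl_parser_attrib, and the names are illustrative):

    #include <stdbool.h>
    #include <stdio.h>
    #include <glad/glad.h>

    struct parsed_attrib {
            const char *name;
            bool input;     /* true for vertex inputs, false for outputs */
    };

    /* Only vertex inputs have attribute locations; outputs are skipped
     * up front instead of silently discarding a -1, so a real -1 on an
     * input can be reported as an error. */
    static bool load_attrib_location(GLuint program,
                    const struct parsed_attrib *attrib, GLint *location)
    {
            if (!attrib->input) {
                    *location = -1;
                    return true;
            }

            *location = glGetAttribLocation(program, attrib->name);
            if (*location == -1) {
                    fprintf(stderr, "Vertex input '%s' not found\n",
                                    attrib->name);
                    return false;
            }

            return true;
    }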
BGRX was being treated as "BGR input" with "RGBA storage", when it
should have been "BGRA input" with "RGB storage". In other words, the
texture input was expecting 24 bits of packed BGR rather than 32-bit
BGRX pixels, and was internally storing it with alpha available.
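In GL terms the corrected upload is roughly this (placeholder function
and parameters, not the exact call site):

    #include <glad/glad.h>

    static void upload_bgrx(GLsizei width, GLsizei height,
                    const void *data)
    {
            /* input format GL_BGRA (read 4 bytes per pixel), internal
             * format GL_RGB (no alpha stored) -- the previous code had
             * these reversed */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                            GL_BGRA, GL_UNSIGNED_BYTE, data);
    }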
My prior code was incorrect; I mixed up the two parameters. The
GL_TEXTURE_SWIZZLE_* parameter specifies the target channel, and the
value itself specifies the source channel, if that makes sense.
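Concretely (illustrative call, not the exact one in the code), to make
the texture's red channel read from the source data's green channel,
the parameter names the target and the value names the source:

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_GREEN);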