Abstract: This article explains how to apply OpenGL shaders to video frames with C++ Builder.
Introduction:
Before 2002, 2D image manipulation had to be done by dedicated hardware. Using hardware support made the solutions very specific, and new hardware had to be bought when the need exceeded the current installation. When programmable graphics cards appeared in 2002, they seemed like a good solution that could deliver speed through hardware while still serving as a general programmable platform. To use the GPU's programmable pipeline you have to create a program called a "shader program". Shaders are executed on the GPU, not the CPU; every GPU has its own processor and its own RAM, and shaders run entirely on them.
You can think of shaders as a built-in library implemented on the GPU. Shaders are written in GLSL, a C-like language. You can read the OpenGL Shading Language Specification on the OpenGL web site.
As I mentioned before, shaders are programs that are executed on the GPU, so each has a main function. A shader program is divided into two kinds of shaders: a vertex shader and a fragment shader (also known as a pixel shader). This article, however, is not about shaders themselves; it is about how to create an OpenGL shader-enabled VCL application with C++ Builder.
Using shaders for complex computations in image processing, or for creating 3D visual effects, is a great idea. I don't know how many C++ Builder developers already know how to use OpenGL shaders, but for those who don't, I think this article will be a big help, just as learning to use shaders with BCB was a big help for me. So, enough talk; let us get to work.
GLEW:
As described in the glew web site “The OpenGL Extension Wrangler Library (GLEW) is a cross-platform open-source C/C++ extension loading library. GLEW provides efficient run-time mechanisms for determining which OpenGL extensions are supported on the target platform. OpenGL core and extension functionality is exposed in a single header file. GLEW has been tested on a variety of operating systems, including Windows, Linux, Mac OS X, FreeBSD, Irix, and Solaris.”
To use shaders in C++ Builder you first have to download the GLEW library from the GLEW web site:
The downloaded ZIP file includes the binary glew32.dll, the headers glew.h and wglew.h, and the import libraries glew32.lib and glew32s.lib. Both are COFF import libraries, which are not compatible with C++ Builder, so we first have to convert them into OMF import libraries using the command-line tool coff2omf:
1- Copy glew.h/wglew.h to “include\gl\”
2- Copy glew32.dll to “Window\system32\”
3- Copy glew32.lib to “Rad Studio\bin\”
4- Then open a command prompt: Start Menu -> Run, then type "cmd".
5- Go to the Rad Studio bin folder with:
“cd C:\Program Files\Embarcadero\RAD Studio\8.0\bin”.
6- Then type: “coff2omf glew32.lib glew32new.lib“.
glew32new.lib is our new OMF import library, compatible with C++ Builder, so copy this lib file to "lib\psdk\". Bingo! Now we can start our first VCL OpenGL shader application.
Starting OpenGL Shader Sample Project:
Start a new VCL application and add the following controls:
Add a TPanel (here we use a TPanel instead of the TForm; the panel will show the video frames affected by the shader).
Add 3 TMemo objects and call them respectively VertexMemo, PixelMemo, and Messages.
VertexMemo will hold the vertex shader source code.
PixelMemo will hold the fragment shader source code.
Messages will hold the shader compile and link results.
Add a TMediaPlayer for playing videos and getting stream frame indexes.
Add a TButton and call it ApplyShader, and add a TOpenDialog for opening the AVI file. Then add a TTimer and set its Interval property to 1.
Add a TTrackBar, call it Saturation, and set its Min and Max properties to 0 and 200.
You also have to create a valid OpenGL context, so if you don't know how to create an OpenGL application using C++Builder, please first read the article "Setting up OpenGL in C++Builder" by John Ray Thomas; in our case, instead of the TForm handle we will use the panel's handle. In this sample I am going to use the VFW (Video for Windows) API to grab video frames from an AVI file, so I also recommend reading the tutorial "Playing AVI Files in OpenGL" on the NeHe web site.
Because this article is dedicated to showing developers how to use OpenGL shaders with C++ Builder, I will not go deeper into explaining how to render AVI frames on an OpenGL surface.
Let's start with the data member declarations:
Header File:
//---------------------------------------------------------------------------
#ifndef OpenGLFormH
#define OpenGLFormH
//---------------------------------------------------------------------------
#include <Classes.hpp>
#include <Controls.hpp>
#include <StdCtrls.hpp>
#include <Forms.hpp>
#include <ExtCtrls.hpp>
#include <Menus.hpp>
#include <MPlayer.hpp>
#include <Dialogs.hpp>
#include <ComCtrls.hpp>
#include <GL/glew.h>
#include <float.h>
#include <vfw.h>    // For using AVI functions
//---------------------------------------------------------------------------
class TForm1 : public TForm
{
__published:    // IDE-managed Components
    TTimer *Timer1;
    TMainMenu *MainMenu1;
    TMenuItem *File1;
    TMenuItem *Open1;
    TMediaPlayer *MediaPlayer1;
    TOpenDialog *OpenDialog1;
    TPanel *Panel1;
    TPageControl *PageControl1;
    TTabSheet *Pixel;
    TTabSheet *Lib;
    TTabSheet *Vertex;
    TMemo *PixelMemo;
    TMemo *Messages;
    TMemo *VertexMemo;
    TTrackBar *Saturation;
    TButton *ApplyShader;
    void __fastcall FormCreate(TObject *Sender);
    void __fastcall FormDestroy(TObject *Sender);
    void __fastcall Timer1Timer(TObject *Sender);
    void __fastcall Open1Click(TObject *Sender);
    void __fastcall Panel1Resize(TObject *Sender);
    void __fastcall SaturationChange(TObject *Sender);
    void __fastcall ApplyShaderClick(TObject *Sender);
private:    // User declarations
    // OpenGL data members
    HDC hdc;
    HGLRC hrc;
    int PixelFormat;
    GLuint TexID;
    // OpenGL shader data members
    GLuint PixelShader, VertexShader, Program;
    GLint SatValueLocation;
    // AVI data members
    AVISTREAMINFO AVIStrInfo;
    PAVISTREAM AVIStream;
    PGETFRAME AVIFrame;
    int W, H;
    long LastAVIFrame;
    bool AVIInitDone;
    char* CurrentFrameData;
    LPBITMAPINFOHEADER BitInfHeader;
public:      // User declarations
    __fastcall TForm1(TComponent* Owner);
    void __fastcall RenderGLScene();
    void __fastcall SetPixelFormatDescriptor();
    void __fastcall SetupRC();
    void __fastcall SetTextureMap();
    // AVI functions
    void OpenAVIStream(AnsiString AVIFile);
    void GetCurrentFrame(int FrameIndex);
    // OpenGL shader functions
    void InitShaders();
    void LoadShader(TStringList *aShadersource, GLuint &aShader);
    void ComplieShader(GLuint aShader);
    void ExecuteProgram();
};
//---------------------------------------------------------------------------
extern PACKAGE TForm1 *Form1;
//---------------------------------------------------------------------------
#endif
PixelShader is the handle to the fragment shader, VertexShader is the handle to the vertex shader, and Program is the handle to the shader program. These three GLuint data members hold the shader program objects.
The InitShaders function initializes the GLEW library and creates the vertex shader, fragment shader, and shader program objects. LoadShader loads a shader text source from the TStringList parameter aShadersource and stores it in the specified shader object, whether vertex or fragment. ComplieShader compiles the shader passed in the parameter aShader and attaches it to the shader Program. Notice that we did not add the shader Program as a parameter; that's because we use only one shader program in this example. Last but not least, ExecuteProgram links the shader program and makes it current so it executes on the GPU. We will discuss these functions in more detail in the next sections.
CPP File:
Texture Mapping Settings:
Before initializing the GLEW library you have to include glew.h. (Take into consideration that you can't include both gl.h and glew.h in the same header file; because we want to create shaders, we include only glew.h, and this header includes gl.h for you automatically.) Also add the converted glew32new.lib to the project. Now let's set up the texture mapping parameters for best quality with the next function.
//---------------------------------------------------------------------------
void __fastcall TForm1::SetTextureMap()
{
    glPixelStorei (GL_UNPACK_ALIGNMENT, 4);
    glGenTextures (1, &TexID);
    glBindTexture (GL_TEXTURE_2D, TexID);
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}
//---------------------------------------------------------------------------
In the SetTextureMap function, glPixelStorei sets the byte alignment for our bits; I specify 4-byte alignment since each pixel is stored in 4-byte chunks. glGenTextures generates a unique value, stored in a variable, that corresponds to the texture in memory. glBindTexture binds the texture image, referenced by the value generated by glGenTextures, to a target, in this case GL_TEXTURE_2D. The following calls to glTexParameteri set up how the texture can be used by OpenGL; the two calls and arguments I supplied are sufficient for our needs and set up the texture in memory to be rendered in full quality.
Initializing the GLEW library and creating shader objects:
In John Ray Thomas's article, after calling the SetupRC function in the form's OnCreate event handler, you start initializing GLEW with this code:
//---------------------------------------------------------------------------
void TForm1::InitShaders()
{
    GLenum err = glewInit ();
    if (err != GLEW_OK)
    {
        MessageBox (HWND_DESKTOP, "GLEW is not initialized!", "Error",
                    MB_OK | MB_ICONEXCLAMATION);
    }
    Program = glCreateProgram ();
    VertexShader = glCreateShader (GL_VERTEX_SHADER);
    PixelShader = glCreateShader (GL_FRAGMENT_SHADER);
}
//---------------------------------------------------------------------------
In the InitShaders function, glewInit initializes the GLEW library so we can start using the shader functions, and we check that GLEW initialized correctly. Now that the library is ready, we create the appropriate objects: glCreateProgram creates a valid program handle to be used as the shader program, which will later be executed on the GPU.
The glCreateProgram function creates an empty program object and returns a non-zero value by which it can be referenced. A program object is an object to which shader objects can be attached. This provides a mechanism to specify the shader objects that will be linked to create a program. It also provides a means for checking the compatibility of the shaders that will be used to create a program (for instance, checking the compatibility between a vertex shader and a fragment shader). When no longer needed as part of a program object, shader objects can be detached. Then we create both the vertex and fragment shader handles with glCreateShader, specifying the shader type in the type parameter: GL_VERTEX_SHADER and GL_FRAGMENT_SHADER.
The glCreateShader function creates an empty shader object and returns a non-zero value by which it can be referenced. A shader object is used to maintain the source code strings that define a shader. The shaderType parameter indicates the type of shader to be created. Two types of shaders are supported: a shader of type GL_VERTEX_SHADER is intended to run on the programmable vertex processor and replace the fixed-functionality vertex processing in OpenGL, while a shader of type GL_FRAGMENT_SHADER is intended to run on the programmable fragment processor and replace the fixed-functionality fragment processing in OpenGL.
//---------------------------------------------------------------------------
void __fastcall TForm1::FormCreate(TObject *Sender)
{
    hdc = GetDC (Panel1->Handle);
    SetPixelFormatDescriptor ();
    hrc = wglCreateContext (hdc);
    wglMakeCurrent (hdc, hrc);
    SetupRC ();
    SetTextureMap ();
    InitShaders ();
}
//---------------------------------------------------------------------------
Opening the AVI file and getting frame data to texture:
//---------------------------------------------------------------------------
void TForm1::OpenAVIStream (AnsiString AVIFile)
{
    AVIFileInit ();
    if (AVIStreamOpenFromFile (&AVIStream, AVIFile.c_str(), streamtypeVIDEO,
                               0, OF_READ, NULL) != 0)
    {
        // An error occurred opening the stream
        MessageBox (HWND_DESKTOP, "Failed To Open the AVI Stream", "Error",
                    MB_OK | MB_ICONEXCLAMATION);
    }
    AVIStreamInfo (AVIStream, &AVIStrInfo, sizeof (AVIStrInfo));
    W = AVIStrInfo.rcFrame.right - AVIStrInfo.rcFrame.left;
    H = AVIStrInfo.rcFrame.bottom - AVIStrInfo.rcFrame.top;
    LastAVIFrame = AVIStreamLength (AVIStream);
    Panel1->Width = W;
    Panel1->Height = H;
    AVIFrame = AVIStreamGetFrameOpen (AVIStream,
                   (LPBITMAPINFOHEADER) AVIGETFRAMEF_BESTDISPLAYFMT);
    if (AVIFrame == NULL)
    {
        MessageBox (HWND_DESKTOP, "Failed To Open the AVI Frame", "Error",
                    MB_OK | MB_ICONEXCLAMATION);
    }
    AVIInitDone = true;
}
//---------------------------------------------------------------------------
void __fastcall TForm1::Open1Click(TObject *Sender)
{
    if (OpenDialog1->Execute())
    {
        MediaPlayer1->FileName = OpenDialog1->FileName;
        MediaPlayer1->Open ();
        OpenAVIStream (OpenDialog1->FileName);
        // Change the time format to detect frame numbers instead of time.
        MediaPlayer1->TimeFormat = tfFrames;
        Timer1->Enabled = true;
    }
}
//---------------------------------------------------------------------------
void TForm1::GetCurrentFrame (int FrameIndex)
{
    BitInfHeader = (LPBITMAPINFOHEADER) AVIStreamGetFrame (AVIFrame, FrameIndex);
    // Pixel data starts right after the header and the color table.
    CurrentFrameData = (char *) BitInfHeader + BitInfHeader->biSize
                       + BitInfHeader->biClrUsed * sizeof (RGBQUAD);
}
//---------------------------------------------------------------------------
void __fastcall TForm1::RenderGLScene ()
{
    glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable (GL_TEXTURE_2D);
    glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    glBindTexture (GL_TEXTURE_2D, TexID);
    glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, W, H, 0, GL_BGRA_EXT,
                  GL_UNSIGNED_BYTE, CurrentFrameData);
    glBegin (GL_QUADS);
        glTexCoord2i (0, 0); glVertex2f (-(W/2.0f), -(H/2.0f));
        glTexCoord2i (0, 1); glVertex2f (-(W/2.0f),  (H/2.0f));
        glTexCoord2i (1, 1); glVertex2f ( (W/2.0f),  (H/2.0f));
        glTexCoord2i (1, 0); glVertex2f ( (W/2.0f), -(H/2.0f));
    glEnd ();
    glFlush ();
    glDisable (GL_TEXTURE_2D);
}
//---------------------------------------------------------------------------
Now that the texture has been bound, it can be defined with the call glTexImage2D. This call brings together all of the parameters into the target GL_TEXTURE_2D and assigns the image bits to the texture object that was bound to GL_TEXTURE_2D in the glBindTexture call.
In the RenderGLScene function we map a location in the texture to a vertex of the polygon, one to one. The value 1.0 is the complete stored image in the X or Y direction; a value of more than 1.0 can be used to map the texture multiple times onto a plane. In this example I use an OnTimer event handler instead of the application idle handler from John Ray Thomas's article, because I think it's faster this way, and setting the Interval to 1 does a great job of speeding things up.
//---------------------------------------------------------------------------
void __fastcall TForm1::Timer1Timer (TObject *Sender)
{
    GetCurrentFrame (MediaPlayer1->Position);
    Caption = MediaPlayer1->Position;
    RenderGLScene ();
    SwapBuffers (hdc);
}
//---------------------------------------------------------------------------
GLSL Shaders and using them in the OpenGL application:
The first time I created an OpenGL shader-enabled application was with C++ Builder 2007. Back then you were able to use hard C-style casts as you liked, but in later versions you couldn't, so I faced a problem loading shaders from the TMemo source into the shader handle. Asger Joergensen and Moritz Beutel explained this to me in a forum thread I created asking why shaders worked fine with CB2007 but not with CB2009 and later versions. Asger said: "In CB2007 all the VCL TMemo, TEdit, TRichEdit, TStringList, etc. contained AnsiStrings, which in essence were arrays of char containing characters in the local codepage format.
From CB2009 onward that has changed; now everything is built around UnicodeStrings, which in essence are arrays of wchar_t containing characters in UTF-16 format. The old short name String, formerly an alias for AnsiString, is now a short name for UnicodeString.
AnsiString.c_str () returns char*
And
UnicodeString.c_str () returns wchar_t*”
Accordingly, I am going to present two versions of the same function, one that works with CB2007 and another for later versions. But before that, let's take a look at the shader code (GLSL) itself. This is the shader source that will be executed and applied to the OpenGL frame buffer; it adjusts the colorfulness of the image using the track bar placed earlier on the form. In other words, the shader adjusts the saturation of the created OpenGL texture.
This code will be stored in the VertexMemo and PixelMemo TMemo objects placed on the form.
Vertex Shader Code:
This will be stored in the Lines property of the VertexMemo memo.
//---------------------------------------------------------------------------
void main (void)
{
    // Set the vertex texture coordinate for texture unit 0 simply by copying
    // the texture coordinate specified in the OpenGL application.
    gl_TexCoord[0] = gl_MultiTexCoord0;
    // Every vertex is multiplied by the model-view-projection matrix.
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
//---------------------------------------------------------------------------
Fragment Shader Code:
This will be stored in the Lines property of the PixelMemo memo.
uniform sampler2D TexID;
uniform float Sat;
//---------------------------------------------------------------------------
void main (void)
{
    vec4 color = texture2D (TexID, gl_TexCoord[0].st);
    vec3 lumCoeff = vec3 (0.2125, 0.7154, 0.0721);
    vec4 lum = vec4 (dot (color.rgb, lumCoeff));
    gl_FragColor = mix (lum, color, Sat);
}
//---------------------------------------------------------------------------
As you can see, each shader has its own main function. A shader program can have only one main function per shader type: you can have as many functions as you want in the vertex shader, but only one main, and the same applies to fragment shaders. You can read more about GLSL on the internet, but for now I will explain this code very quickly. First the fragment shader reads the texture color with the function texture2D as a vec4 (r, g, b, a). Then we take the dot product of lumCoeff and the texture color, which gives us the image luminance (lightness). The saturation adjustment is done by blending the two layers: lum is the background, color is the foreground, and the uniform variable Sat is the opacity. Decreasing Sat to 0.0 drives the image toward gray-scale, while increasing Sat to 2.0 makes the image more vivid. A uniform variable can be manipulated from the host application (the OpenGL application) using the function glUniform1f (explained later). In this example we change the value of Sat using the Saturation track bar: whenever its Position property changes, the value of the uniform variable Sat changes too.
//---------------------------------------------------------------------------
void TForm1::LoadShaderCB2007v (TStringList *aShadersource, GLuint &aShader)
{
    char* ShaderSource = (char*) aShadersource->GetText();
    GLint fLength = (GLint) StrLen ((const char*) ShaderSource);
    glShaderSource (aShader, 1, (const GLcharARB **)&ShaderSource, &fLength);
}
//---------------------------------------------------------------------------
void TForm1::LoadShaderCB2009AndLaterv (TStringList *aShadersource, GLuint &aShader)
{
    AnsiString localCopy = AnsiString (aShadersource->Text);
    char* ShaderSource = localCopy.c_str ();
    GLint fLength = localCopy.Length ();
    glShaderSource (aShader, 1, (const GLcharARB **)&ShaderSource, &fLength);
}
//---------------------------------------------------------------------------
Both functions do the same thing in different ways: both load a shader source into a shader handle using glShaderSource, which assigns a source string to the specified shader object. In the LoadShaderCB2007v version we grab the text of the TStringList object aShadersource as an array of char and assign it to the ShaderSource variable, then compute the length of the shader source stored in ShaderSource. The glShaderSource function sets the source code in aShader to the source code in the array of strings specified by ShaderSource; any source code previously stored in the aShader object is completely replaced.
In the LoadShaderCB2009AndLaterv version, on the other hand, we first convert the TStringList->Text property from UnicodeString to AnsiString, then grab the array of char and assign it to the ShaderSource variable. This time we get the length of the shader string differently, using the Length function of the AnsiString variable, and then call glShaderSource as usual to load the source into the shader object. Next, all we have to do is compile the shaders, attach them to the shader program, then link the program and use it so it executes on the OpenGL frame buffer.
//---------------------------------------------------------------------------
void TForm1::ComplieShader (GLuint aShader)
{
    glCompileShader (aShader);
    GLint status;
    glGetShaderiv (aShader, GL_COMPILE_STATUS, &status);
    GLint logLength = 0;
    glGetShaderiv (aShader, GL_INFO_LOG_LENGTH, &logLength);
    GLchar *log = new GLchar [logLength];
    glGetShaderInfoLog (aShader, logLength, NULL, log);
    Messages->Lines->Add (AnsiString (log));
    delete[] log;
    glAttachShader (Program, aShader);
}
//---------------------------------------------------------------------------
So what does the ComplieShader function do? It compiles the shader passed to it as the shader object aShader, then checks the compile result and adds it to the TMemo object Messages. glCompileShader compiles the source code strings that have been stored in the specified shader object.
The compilation status is stored as part of the shader object's state. This value is set to GL_TRUE if the shader compiled without errors and is ready for use, and GL_FALSE otherwise. It can be queried by calling glGetShaderiv with arguments shader and GL_COMPILE_STATUS. Compilation of a shader can fail for a number of reasons, as specified by the OpenGL Shading Language Specification. Whether or not the compilation was successful, information about the compilation can be obtained from the shader object's information log by calling glGetShaderInfoLog.
In order to create an executable, there must be a way to specify the list of things that will be linked together. Program objects provide this mechanism. Shaders that are to be linked together in a program object must first be attached to that program object. The glAttachShader function attaches the shader object specified by aShader to the Program object specified by program. This indicates that shader will be included in link operations that will be performed on Program.
All operations that can be performed on a shader object are valid whether or not the shader object is attached to a program object. It is permissible to attach a shader object to a program object before source code has been loaded into the shader object or before the shader object has been compiled. It is permissible to attach multiple shader objects of the same type because each may contain a portion of the complete shader. It is also permissible to attach a shader object to more than one program object. If a shader object is deleted while it is attached to a program object, it will be flagged for deletion, and deletion will not occur until glDetachShader is called to detach it from all program objects to which it is attached.
//---------------------------------------------------------------------------
void TForm1::ExecuteProgram ()
{
    glLinkProgram (Program);
    GLint status;
    glGetProgramiv (Program, GL_LINK_STATUS, &status);
    GLint logLength = 0;
    glGetProgramiv (Program, GL_INFO_LOG_LENGTH, &logLength);
    GLchar *log = new GLchar [logLength];
    glGetProgramInfoLog (Program, logLength, NULL, log);
    Messages->Lines->Add (AnsiString (log));
    delete[] log;
    glUseProgram (Program);
}
//---------------------------------------------------------------------------
Now that everything related to the shaders and their sources is done correctly, all we have to do is make the shader program execute on the OpenGL frame buffer. We do that using two functions:
The glLinkProgram links the program object specified by program. If any shader objects of type GL_VERTEX_SHADER are attached to program, they will be used to create an executable that will run on the programmable vertex processor. If any shader objects of type GL_FRAGMENT_SHADER are attached to program, they will be used to create an executable that will run on the programmable fragment processor.
The status of the link operation is stored as part of the program object's state. This value is set to GL_TRUE if the program object linked without errors and is ready for use, and GL_FALSE otherwise. It can be queried by calling glGetProgramiv with arguments program and GL_LINK_STATUS. As a result of a successful link operation, all active user-defined uniform variables belonging to program are initialized to 0, and each of the program object's active uniform variables is assigned a location that can be queried by calling glGetUniformLocation. Also, any active user-defined attribute variables that have not been bound to a generic vertex attribute index are bound to one at this time.
Next, glUseProgram installs the program object specified by program as part of the current rendering state. One or more executables are created in a program object by successfully attaching shader objects to it with glAttachShader, successfully compiling the shader objects with glCompileShader, and successfully linking the program object with glLinkProgram.
A program object will contain an executable that will run on the vertex processor if it contains one or more shader objects of type GL_VERTEX_SHADER that have been successfully compiled and linked. Similarly, a program object will contain an executable that will run on the fragment processor if it contains one or more shader objects of type GL_FRAGMENT_SHADER that have been successfully compiled and linked.
Now the shader program is executed on the OpenGL frame buffer, and you can see its effect once the following OnClick event handler of the ApplyShader button runs.
//---------------------------------------------------------------------------
void __fastcall TForm1::ApplyShaderClick(TObject *Sender)
{
    LoadShader ((TStringList*) VertexMemo->Lines, VertexShader);
    LoadShader ((TStringList*) PixelMemo->Lines, PixelShader);
    ComplieShader (VertexShader);
    ComplieShader (PixelShader);
    ExecuteProgram ();
    SatValueLocation = glGetUniformLocation (Program, "Sat");
    glUniform1f (SatValueLocation, 1.0);
}
//---------------------------------------------------------------------------
All we do in this event handler is load both the vertex and fragment shader sources into their shader objects, then compile them and attach them to the shader program. Next we link the program and install it so it executes on the frame buffer. The last two calls get the location of the uniform variable Sat from Program and set its value to 1.0. In the next section I will demonstrate how to manipulate uniform variables from the host application using the Saturation track bar we created earlier.
Manipulate shader uniform variables from host application:
//---------------------------------------------------------------------------
void __fastcall TForm1::SaturationChange(TObject *Sender)
{
    float SatPos = float (Saturation->Position) / 100.0f;
    SatValueLocation = glGetUniformLocation (Program, "Sat");
    glUniform1f (SatValueLocation, SatPos);
}
//---------------------------------------------------------------------------
The glGetUniformLocation returns an integer that represents the location of a specific uniform variable within a program object. The name must be a null terminated string that contains no white space. The name must be an active uniform variable name in program that is not a structure, an array of structures, or a subcomponent of a vector or a matrix. This function returns -1 if name does not correspond to an active uniform variable in program or if name starts with the reserved prefix “gl_”.
The location of the Sat uniform variable is stored in SatValueLocation then we use the glUniform1f to set the value of “Sat” using its location and the new value with SatPos. The commands glUniform {1|2|3|4} {f|i} are used to change the value of the uniform variable specified by location using the values passed as arguments. The number specified in the command should match the number of components in the data type of the specified uniform variable (e.g., 1 for float, int, bool; 2 for vec2, ivec2, bvec2, etc.).
Now that we are finished, you can run and test the application. Voilà: your first OpenGL shader application using C++ Builder 2007/2009.
Conclusion:
In this article I explained how to use OpenGL (GLSL) shaders, manipulate them from the host application, and render video frames on an OpenGL buffer in a VCL application using C++ Builder 2007 or later. If you have any questions, feel free to comment on this article. I hope next time to talk about surface computing with C++ Builder.