Unexpected glGetActiveUniformName behavior

According to the documentation, glGetActiveUniformName can be used to query the length of a single GLSL uniform name: pass NULL (0) for uniformName and supply a pointer to a GLsizei variable in the length parameter. One would therefore expect the length of the uniform name to be returned through length, but that variable does not change after a call made as described above. Instead, the call raises GL_INVALID_VALUE, which, according to the documentation, can mean one of three things:

GL_INVALID_VALUE is generated if uniformIndex is greater than or equal to the value of GL_ACTIVE_UNIFORMS.

GL_INVALID_VALUE is generated if bufSize is negative.

GL_INVALID_VALUE is generated if program is not the name of a program object for which glLinkProgram has been issued.

In my case, uniformIndex is 0 and the value of GL_ACTIVE_UNIFORMS is 5. bufSize is 0 and therefore not negative. Calling glGetProgramiv with GL_LINK_STATUS on the program returns GL_TRUE (1). This rules out all three documented causes of the error. glGetError reports no prior error immediately before the call to glGetActiveUniformName, and calling the function with a valid bufSize and a valid uniformName buffer succeeds without error: the correctly spelled uniform name is written to the buffer at uniformName as expected.
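For clarity, here is a minimal sketch of the failing call, assuming a GL loader is in place and prog is the linked program object in question:

    GLsizei length = 0;
    /* Query only the name length of the uniform at index 0:
       bufSize = 0 and uniformName = NULL, as the documentation describes. */
    glGetActiveUniformName(prog, 0, 0, &length, NULL);
    /* Expected: length receives the length of the uniform name.
       Observed: GL_INVALID_VALUE is raised and length is left unchanged. */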

Besides the misleading failure to report the length of a single name, there are other warning signs: if bufSize is set to a value greater than zero while uniformName remains NULL (0), the function writes through the NULL pointer anyway, which can cause severe memory corruption. This behavior is inconsistent with the documentation, which states that if uniformName is NULL (0), nothing is written to it.
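A minimal sketch of this dangerous case, under the same assumptions as above (do not run this against an affected driver):

    GLsizei length = 0;
    /* bufSize > 0 but uniformName == NULL: per the documentation nothing
       should be written, yet the affected driver writes the name through
       the NULL pointer anyway. */
    glGetActiveUniformName(prog, 0, 16, &length, NULL);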

Putting all of the above together, I think I can safely conclude that the error is not in my code but somewhere outside of it, which is why I am posting here. Can anyone test the behavior of this function when used as described above?

I suspect this is a driver issue, which would mean that only people working with nVidia drivers/hardware will see the problem, but it is too early to exclude any user group.

PS: The above happened on a machine with the latest drivers (updated specifically to verify the above) and an nVidia GeForce GTX 660 installed. The machine is a Dell XPS 8500 running Windows 7 Professional SP1 64-bit. The program was built with a fully updated Visual Studio 2012 Professional. This post was also made to the nVidia and DevMaster developer forums and is cross-posted here for reference, in the hope of resolving the issue faster.

UPDATE 1: The code in question was tested on a different computer with a different GPU (nVidia GeForce 9600M GT), again with the latest drivers, with identical results. That computer runs Windows 7 Ultimate SP1 64-bit, and again the binary was built with VS2012.


1 answer


The documentation seems to be wrong here. The OpenGL specification is very clear: length is the number of characters actually written to the buffer. In addition, the specification does not require any NULL check for the uniformName parameter.

Or, to put it another way, you cannot use glGetActiveUniformName to get the length of a single name. You have to do it with glGetActiveUniform.
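For illustration, a minimal sketch of that approach, assuming <stdlib.h> is included and prog is a linked program object:

    GLint maxLen = 0; /* upper bound on active uniform name lengths, incl. '\0' */
    glGetProgramiv(prog, GL_ACTIVE_UNIFORM_MAX_LENGTH, &maxLen);

    GLchar *name = malloc(maxLen);
    GLsizei length = 0; /* receives the actual name length, excluding '\0' */
    GLint size = 0;
    GLenum type = 0;
    glGetActiveUniform(prog, 0, maxLen, &length, &size, &type, name);
    /* length now holds the number of characters written to name. */
    free(name);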



I have corrected the wiki documentation.
