While experimenting with the GL_ARB_get_program_binary extension, I found various sample code that seemed a little flawed, while the 'official' sample code in the spec is fine.
To sum it up, when using glGetProgramBinary to fetch the binary representation of a shader (a 'program' in GL wording), OpenGL provides an 'opaque' binaryFormat value that is meant to be kept alongside the binary data. This value is needed when calling glProgramBinary to reconstruct the shader from the binary data.
For basic usage, there is no need to enumerate the available binary formats using a glGet with GL_PROGRAM_BINARY_FORMATS.
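That said, before relying on the extension it can still be worth checking that the driver exposes at least one binary format at all; a minimal sketch (GL_NUM_PROGRAM_BINARY_FORMATS is the standard query, the fallback comment is just an assumption about how you would handle it):

GLint numFormats = 0;
glGetIntegerv( GL_NUM_PROGRAM_BINARY_FORMATS, &numFormats );
if ( numFormats < 1 )
{
    /* no binary formats supported: fall back to compiling from GLSL source */
}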
To retrieve the shader binary, one must call:
GLenum  binaryFormat = 0;
GLsizei binaryLength = 0;
GLint   bufferSize   = 0;
glGetProgramiv( program, GL_PROGRAM_BINARY_LENGTH, &bufferSize ); /* size of the binary */
void* buffer = malloc( bufferSize );
glGetProgramBinary( program, bufferSize, &binaryLength, &binaryFormat, buffer );
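Since binaryFormat must survive together with the blob, a straightforward approach is to serialize it in front of the data; here is a minimal sketch assuming a plain stdio cache file (the "shader.bin" filename is just an example):

/* write the opaque format and length first, then the blob itself */
FILE* f = fopen( "shader.bin", "wb" );
fwrite( &binaryFormat, sizeof( binaryFormat ), 1, f );
fwrite( &binaryLength, sizeof( binaryLength ), 1, f );
fwrite( buffer, 1, binaryLength, f );
fclose( f );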
To reconstruct a shader from the binary data:
glProgramBinary( program, binaryFormat, buffer, binaryLength );
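Note that glProgramBinary can reject the blob (typically after a driver or hardware change), so it is worth checking the link status afterwards and falling back to compiling from source; a minimal sketch:

GLint status = GL_FALSE;
glGetProgramiv( program, GL_LINK_STATUS, &status );
if ( status != GL_TRUE )
{
    /* the binary was rejected (e.g. driver changed): rebuild from GLSL source */
}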
One more thing: in order to be able to get the binary representation, you need to hint the driver about it, and this must be done before linking the program. This is done by calling:
/* must come before glLinkProgram or glProgramBinary */
glProgramParameteri( program, GL_PROGRAM_BINARY_RETRIEVABLE_HINT, GL_TRUE );
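Putting it together, the hint slots in between attaching the shaders and linking; a minimal sketch (vertexShader and fragmentShader are assumed to be already-compiled shader objects):

GLuint program = glCreateProgram();
glAttachShader( program, vertexShader );
glAttachShader( program, fragmentShader );
/* ask the driver to keep a retrievable binary around */
glProgramParameteri( program, GL_PROGRAM_BINARY_RETRIEVABLE_HINT, GL_TRUE );
glLinkProgram( program );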
So far so good: it seems to work fine on the latest NVIDIA and AMD drivers, and the performance gain appears to be larger on AMD, though my preliminary measurements give me strange results, so I have to double-check them.
8/31/2011