Astro Ducks - A Complete Game Project

Frame buffers

Framebuffer objects are an OpenGL concept that allows us to render to targets other than the screen. By default, when you use OpenGL to render fragments, the actual pixel data ends up in a color buffer (the "back" buffer in a double-buffering system) and then finally on your screen. We can, however, create our own framebuffer objects if we want to render to something else - for example a 2D texture that we can then use when drawing a textured quad.

A framebuffer can be thought of as a series of 2D images that are connected to each other. A common setup is, for example, a color buffer and a depth buffer, as illustrated below (the top part is the color buffer, the bottom part the depth buffer).

Example frame buffer consisting of a color buffer and a depth buffer

We will be using a setup with a framebuffer containing both a color and a depth buffer when creating our reflection texture. What we want to end up with is a texture that we can use to draw our water surface. So how do we do that?

Creating the framebuffer object

We implement two functions to create and clean up our framebuffer objects (they can be found in renderToTexture.h):


bool GenerateFrameBufferAndTexture(int width, int height, GLuint &frameBuffer, GLuint &renderedTexture, GLuint &depthrenderbuffer);
void CleanUpFrameBufferAndAssociatedData(GLuint frameBuffer, GLuint renderedTexture, GLuint depthrenderbuffer);


Let's have a look at the code that creates the framebuffer object:

bool GenerateFrameBufferAndTexture(int width, int height, GLuint &frameBuffer, GLuint &renderedTexture, GLuint &depthrenderbuffer)
{
	frameBuffer = 0;
	glGenFramebuffers(1, &frameBuffer);
	glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);

	renderedTexture = 0;
	glGenTextures(1, &renderedTexture);
	glBindTexture(GL_TEXTURE_2D, renderedTexture);

	// Reserve storage for the texture we will render to
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, 0);

	// No interpolation - we want the raw image we rendered
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

	// We want a depth buffer too
	depthrenderbuffer = 0;
	glGenRenderbuffers(1, &depthrenderbuffer);
	glBindRenderbuffer(GL_RENDERBUFFER, depthrenderbuffer);
	glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, width, height);
	glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthrenderbuffer);

	// Set "renderedTexture" as our color attachment #0
	glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, renderedTexture, 0);

	bool retValue = true;
	// Check that our framebuffer is complete
	GLenum frameBuffStatus = glCheckFramebufferStatus(GL_FRAMEBUFFER);
	if (frameBuffStatus != GL_FRAMEBUFFER_COMPLETE)
		retValue = false;

	// By default, render to the screen again
	glBindFramebuffer(GL_FRAMEBUFFER, 0);

	return retValue;
}

First, we generate the FBO (framebuffer object) by calling glGenFramebuffers, and then we bind it by calling glBindFramebuffer. We bind the framebuffer so that our subsequent calls to glFramebufferRenderbuffer, glFramebufferTexture and glCheckFramebufferStatus will operate on the framebuffer object we just created.

After the FBO is created and bound, we create the 2D texture we will be rendering to. We use GL_NEAREST for GL_TEXTURE_MAG_FILTER/GL_TEXTURE_MIN_FILTER because we don't want any interpolation done on the texture. Since we are going to render to the texture, we don't want OpenGL to interpolate what we output - we want the "raw image" that we produced.

We also want to use a depth buffer. Hence, we create a renderbuffer using glGenRenderbuffers. We then bind it and reserve depth storage using glRenderbufferStorage - creating a depth buffer the same size as our color buffer. We need this buffer to support depth testing, just as we use a depth buffer when rendering the scene to the screen. Finally, we attach it to the framebuffer using glFramebufferRenderbuffer.

Then we attach our texture as the "color attachment" of our framebuffer using glFramebufferTexture. This is what makes OpenGL render what is seen into our texture.

Lastly, we check the "completeness" of our new framebuffer using glCheckFramebufferStatus. There are a couple of rules on how you can compose the 2D images that make up a framebuffer - for example, every attached buffer must actually have its memory reserved. You can read more about these rules at Framebuffer Object (section "Framebuffer Completeness").
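When the completeness check fails, the raw status code is not very readable. A small helper like the following (a hypothetical addition, not part of renderToTexture.h) can translate the most common status values into log-friendly messages:

```cpp
#include <GL/gl.h>

// Hypothetical helper: map a glCheckFramebufferStatus result to a
// human-readable string, e.g. for logging when
// GenerateFrameBufferAndTexture returns false.
const char *FramebufferStatusString(GLenum status)
{
	switch (status)
	{
	case GL_FRAMEBUFFER_COMPLETE:                      return "complete";
	case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT:         return "incomplete attachment";
	case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT: return "no attachments";
	case GL_FRAMEBUFFER_UNSUPPORTED:                   return "unsupported format combination";
	default:                                           return "other/unknown status";
	}
}
```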

Before we return, we reset rendering to the default framebuffer, that is, the screen.
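The cleanup counterpart simply deletes the three objects again. A minimal sketch of what CleanUpFrameBufferAndAssociatedData could look like (the actual body in renderToTexture.h may differ):

```cpp
#include <GL/gl.h>

void CleanUpFrameBufferAndAssociatedData(GLuint frameBuffer, GLuint renderedTexture, GLuint depthrenderbuffer)
{
	// Delete the attachments first, then the framebuffer object itself
	glDeleteRenderbuffers(1, &depthrenderbuffer);
	glDeleteTextures(1, &renderedTexture);
	glDeleteFramebuffers(1, &frameBuffer);
}
```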

Using the framebuffer

Ok, so now that we have our framebuffer - how can we render using it? It's very simple: just bind the framebuffer object and change the viewport to match the size of the texture created in GenerateFrameBufferAndTexture.

glBindFramebuffer(GL_FRAMEBUFFER, frameBufferObj);
glViewport(0, 0, textureWidth, textureHeight);

After that, all subsequent draw calls that write to the color buffer will end up in your framebuffer, which in turn stores the results in the texture you specified. Simple as that!
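Putting the pieces together, a full render-to-texture pass might look like the sketch below. Names such as frameBufferObj, textureWidth, DrawReflectedScene, screenWidth and screenHeight are placeholders for whatever the game actually uses:

```cpp
// Render one frame into the texture, then switch back to the screen.
// frameBufferObj and renderedTexture come from GenerateFrameBufferAndTexture.
glBindFramebuffer(GL_FRAMEBUFFER, frameBufferObj);
glViewport(0, 0, textureWidth, textureHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

DrawReflectedScene();  // hypothetical - all draw calls now end up in renderedTexture

// Back to the default framebuffer (the screen) and the normal viewport
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, screenWidth, screenHeight);

// renderedTexture can now be bound like any other 2D texture,
// e.g. when drawing the water surface
glBindTexture(GL_TEXTURE_2D, renderedTexture);
```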

Back to main page    Next section - Collision management