One of the more exciting new features of the iPhone 3GS is its faster, more advanced graphics hardware and its support for OpenGL ES 2.0. Unfortunately, Apple hasn’t provided much information at all about how to harness those new capabilities. They provide great documentation and sample code for most of their APIs, but somehow their samples and info on OpenGL have always been pretty lackluster.
They don’t even provide a barebones sample or Xcode template to get you started with OpenGL ES 2.0. If you want to take advantage of the new graphics capabilities, apparently it’s up to you to figure out how. Don’t be fooled into thinking that OpenGL ES 2.0 is a minor upgrade over OpenGL ES 1.1 with just a couple of new functions. It’s a whole new beast! The fixed-function pipeline is gone, and it requires you to use shaders and be more familiar with the basics of computer graphics before you can even get a triangle on screen.
Given the total lack of documentation, I set out to create the most barebones app using OpenGL ES 2.0 on the iPhone: something that could be used as a starting point by other developers in their own apps. I debated whether to create an app that displayed a spinning teapot or some other simple mesh, but I didn’t want to get lost in the details of loading a model or doing different transforms. So in the end, I decided to simply update the OpenGL ES 1.1 app that comes as part of the Xcode template. The completed code is available for download here.
Yes, I know, not terribly exciting. It’s just a spinning quad! But it will be enough to cover the basics of initializing OpenGL, creating shaders, hooking them up to the program, and using them. In the end, we’ll even throw in a little twist and add something that you could never do with OpenGL ES 1.1. Ready?
OpenGL initialization is almost exactly the same as with OpenGL ES 1.1. The only difference is that you need to tell it the new API version for ES 2.0:
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
The rest of the details as far as EAGLView, creating back buffers, and such is all the same as it was before, so I’m not going to cover them here (but I’ll cover those things in detail in my upcoming Two-Day OpenGL Programming Class).
Keep in mind that when you initialize OpenGL ES 2.0, you won’t be able to call functions that are specific to OpenGL ES 1.1. If you try to use them, you’ll get a crash at runtime because they haven’t been set up properly. That means that if you want to take advantage of the 3GS graphics capabilities but also want to run on older models, you need to check at runtime which kind of device you’re on, enable either OpenGL ES 1.1 or 2.0, and keep two very different code paths, one for each of them.
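A common way to handle that check is to attempt to create an ES 2.0 context first and fall back to ES 1.1 if it fails. This is just a sketch (the usingES2 flag is a made-up name for however your app decides between the two code paths):

```objc
// Try OpenGL ES 2.0 first; on hardware that doesn't support it, this returns nil.
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
BOOL usingES2 = (context != nil);

if (!usingES2)
{
    // Fall back to the OpenGL ES 1.1 fixed-function path.
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
}
```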
OpenGL ES 1.1 uses the fixed-function pipeline to render polygons. With OpenGL ES 2.0, in order to render anything on screen, we need to write shaders. Shaders are little programs that run on the dedicated graphics hardware and transform the input data (vertices and states) into an image on screen. They are written in the OpenGL Shading Language (or GLSL for short), which will be very familiar to those of us used to C. You should learn the details of the language to take full advantage of its capabilities, but here is a short overview of the major concepts and how they’re related.
There are two types of shaders: vertex shaders which are executed at each vertex, and fragment shaders, which are executed for every pixel — well, technically, they’re executed at every fragment, which might not correspond to a pixel if you use antialiasing for example. For now, we can safely think of a fragment shader executing at each rendered pixel.
A vertex shader computes the position for a vertex in clip space. It can optionally compute other values so they can be used in a fragment shader.
There are two main types of inputs for a vertex shader: uniform and attribute inputs.
- Uniform inputs are values that are set once from the main program and applied to all the vertices processed by the vertex shader during a draw call. For example, the world view transform is a uniform input.
- Attribute inputs can vary for each vertex in the same draw call. For example, position, normal, or color information are attribute inputs.
Additionally, a vertex shader has two kinds of outputs:
- An implicit position output in the variable gl_Position. That’s where the vertex should be in clip space (later gets transformed into viewport space).
- Varying outputs. Variables declared with the varying qualifier are interpolated between vertices, and the resulting values are passed as inputs to the fragment shader.
The vertex shader for the sample program simply transforms the vertex position from model space into clip space and passes the vertex color to be interpolated for the fragment shader:
uniform mat4 u_mvpMatrix;
attribute vec4 a_position;
attribute vec4 a_color;
varying vec4 v_color;

void main()
{
    gl_Position = u_mvpMatrix * a_position;
    v_color = a_color;
}
Fragment shaders compute the color of a fragment (pixel). Their inputs are the varying variables interpolated from the vertex shader outputs, along with built-in variables such as gl_FragCoord. The color computation can be as simple as writing a constant into gl_FragColor, or it can look up a texel in a texture based on uv coordinates, or it can be a complex operation taking the lighting environment into account.
Our sample fragment shader couldn’t be any easier. It takes the color handed down from the vertex shader and applies it to that fragment.
varying lowp vec4 v_color;

void main()
{
    gl_FragColor = v_color;
}
Maybe all of this sounds too vague and free-form, but that’s the beauty of shaders: There’s nothing pre-set, and how things are rendered is completely up to you (and the limitations of the hardware).
We have some shaders ready to go. How do we run them? We need to go through a few steps, the first of which is to compile and link them.
At runtime, we’ll need to load the source code text for each of the vertex and fragment shaders and compile it with a few OpenGL calls.
const unsigned int shader = glCreateShader(type);
glShaderSource(shader, 1, (const GLchar**)&source, NULL);
glCompileShader(shader);
Once the shaders are compiled, we need to create a shader program, add the shaders, and link them together.
m_shaderProgram = glCreateProgram();
glAttachShader(m_shaderProgram, vertexShader);   // handles returned by glCreateShader
glAttachShader(m_shaderProgram, fragmentShader);
glLinkProgram(m_shaderProgram);
The linking step is what hooks up the outputs of the vertex shader with the expected inputs of the fragment shader.
You can detect if there were any errors during compilation or linking and display a message explaining the cause of the error.
GLint success;
GLchar errorMsg[2048];
glGetShaderiv(shader, GL_COMPILE_STATUS, &success);
if (success == 0)
    glGetShaderInfoLog(shader, sizeof(errorMsg), NULL, errorMsg);
If the idea of compiling and linking programs at runtime bothers you, you’re not alone. Ideally, this would be a step done offline, just like compiling the source code of your main program in Objective-C. Unfortunately, Apple is trying to keep things open and allow for the format to change in the future, so they’re forcing us to compile and link on the fly. This is not just annoying; it’s also potentially quite slow once you start accumulating several shaders. Who wants to wait a few more seconds while their app starts? For now, we just have to put up with this annoyance as part of the price of using shaders on the iPhone.
Hooking Things Up
We’re almost ready to start using our shaders, but before we do that, we need to find out how to set the correct inputs. The vertex shader expects an mvp (model-view-projection) matrix to be set, as well as a stream of vertex data with positions and colors.
We do this by querying the shader program for the parameters we need by name. Each query returns a handle that we keep around so we can use it to set the values right before rendering the model.
m_a_positionHandle = glGetAttribLocation(m_shaderProgram, "a_position");
m_a_colorHandle = glGetAttribLocation(m_shaderProgram, "a_color");
m_u_mvpHandle = glGetUniformLocation(m_shaderProgram, "u_mvpMatrix");
Using The Shaders
Finally we’re ready to render some polygons with our shaders. All we have to do is enable the shader program…

glUseProgram(m_shaderProgram);
…and set the correct input values using the handles to the input parameters we queried earlier:
glVertexAttribPointer(m_a_positionHandle, 2, GL_FLOAT, GL_FALSE, 0, squareVertices);
glEnableVertexAttribArray(m_a_positionHandle);
glVertexAttribPointer(m_a_colorHandle, 4, GL_FLOAT, GL_FALSE, 0, squareColors);
glEnableVertexAttribArray(m_a_colorHandle);
glUniformMatrix4fv(m_u_mvpHandle, 1, GL_FALSE, (GLfloat*)&mvp.m);
And now, we just call any of the render functions we’re familiar with from OpenGL ES 1.1 (glDrawArrays or glDrawElements):
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
If all went well, you should see your model rendered correctly on screen. In this case it’s just a quad, but you can use this as a starting point to render your own models with your own transforms, shaders, and effects.
A Little Something Extra
I admit that getting all the way here just to have the same spinning quad we had in the template for OpenGL ES 1.1 is quite underwhelming. Sure, it’s a necessary stepping stone to get started, but still. So, to show off how easy it is to create different effects with GLSL, here’s a modified pixel shader that spices things up a bit.
varying lowp vec4 v_color;

void main()
{
    mediump float odd = floor(mod(gl_FragCoord.y, 2.0));
    gl_FragColor = vec4(v_color.x, v_color.y, v_color.z, odd);
}
This version of the fragment shader checks whether the pixel is on an even or odd line, and renders even lines as fully transparent, giving the quad an interlaced look in screen space (for the transparency to actually show, you’ll need blending enabled with glEnable(GL_BLEND) and an appropriate glBlendFunc). This is an example of an effect that would be quite difficult to achieve with OpenGL ES 1.1, but it’s just a couple of simple lines with 2.0.
Armed with the sample code and an idea of how shaders work, you should be able to start creating your own shaders and come up with some interesting and unique visuals in your games.
The completed code is available for download here.