Optimizing Ambient/Diffuse/Specular lighting shaders [update: now with video]

I’m thinking about how to optimize the lighting shaders in a game I’m working on. At the moment I’m getting a fairly consistent 59–60 FPS on an iPad Air, with the occasional dip. I haven’t had a chance to test on older hardware yet, but I’m conscious that performance might take a dive. As per Apple’s shader optimization guidelines, I have lots of shaders targeted at specific uses, e.g. depending on whether the mesh has a texture or not. At the moment I’m just using ambient + diffuse directional lighting (AD), but for certain, shall we say, “money shot” objects (i.e. important things that the player will spend a lot of time looking at), specular lighting would look really cool (ADS). My experience in the past, though, has been that adding specular lighting can be pretty expensive.

Whilst comparing various shader examples, though, I came across this one, which does the specular calculation in the vertex shader and just reads the interpolated colour in the fragment shader, whereas most others do the expensive specular calculations in the fragment shader, i.e. on a per-pixel basis:

The ADS Lighting Model

I thought I’d try porting it to Codea, but before I did I wanted to ask the 3D/OpenGL ES experts on this forum what they thought of this shader. I guess it would only really look good with mid- to high-poly objects (an eight-vertex cube would be no good). Would it represent a saving in performance terms, though?

[EDIT: you can ignore this paragraph. I just realised that in the other examples, although the specular function is defined outside the main fragment function, it is called from main, so the specular calculation is still being done for every pixel. So, rephrasing my question: has anyone else tried doing the specular calculation in the vertex shader and then just using interpolation in the fragment shader, as in the code below?] I guess what I’m really asking is: how often is the specular function executed when it lives in the fragment shader but outside the main function (the approach most examples take), versus here, where it is in the vertex shader? I know that code inside the fragment main function is executed for every single pixel, but what about code outside the main function?

Here is the shader from the above link. A one-line fragment main function is pretty tempting! (You’d have to add more if you wanted textures, of course.)

Vert:

#version 130
 
// Our Model, View and Projection matrices
// we need them to transform the incoming vertices and for lighting
// calculation
uniform mat4 mModel;
uniform mat4 mView;
uniform mat4 mProjection;
 
// Position of the "Camera" and the Light
// in Model space
uniform vec4 vEyePosition;
uniform vec4 vLightPosition;
 
// The Colors of the material
uniform vec4 vAmbientMaterial;
uniform vec4 vDiffuseMaterial;
uniform vec4 vSpecularMaterial;
 
// Vertex properties
in vec4 vVertexPosition;
in vec3 vVertexNormal;
 
// Final color of the vertex we pass on to the next stage
smooth out vec4 vVaryingColor;
 
// Returns the specular component of the color
vec4 GetSpecularColor()
{
    // Transform the Vertex and corresponding Normal into Model space
    vec4 vTransformedNormal = mModel * vec4( vVertexNormal, 1 );
    vec4 vTransformedVertex = mModel * vVertexPosition;
 
    // Get the directional vector to the light and to the camera
    // originating from the vertex position
    vec4 vLightDirection = normalize( vLightPosition - vTransformedVertex );
    vec4 vCameraDirection = normalize( vEyePosition - vTransformedVertex );
 
    // Calculate the reflection vector between the incoming light and the
    // normal (incoming angle = outgoing angle)
    // We have to use the invert of the light direction because "reflect"
    // expects the incident vector as its first parameter
    vec4 vReflection = reflect( -vLightDirection, vTransformedNormal );
 
    // Calculate specular component
    // Based on the dot product between the reflection vector and the camera
    // direction.
    //
    // hint: The Dot Product corresponds to the angle between the two vectors
    // hint: if the angle is out of range (0 ... 180 degrees) we use 0.0
    float spec = pow( max( 0.0, dot( vCameraDirection, vReflection )), 32 );
 
    return vec4( vSpecularMaterial.r * spec, vSpecularMaterial.g * spec, vSpecularMaterial.b * spec, 1.0 );
}
 
// Ambient color component of vertex
vec4 GetAmbientColor()
{
    return vAmbientMaterial * vec4( 0.2, 0.2, 0.2, 1.0 );
}
 
// Diffuse Color component of vertex
vec4 GetDiffuseColor()
{
    // Transform the normal from Object to Model space
    // we also normalize the vector just to be sure ...
    vec4 vTransformedNormal = normalize( mModel * vec4( vVertexNormal, 1 ));
 
    // Get direction of light in Model space
    vec4 vLightDirection = normalize( vLightPosition - vTransformedNormal );
 
    // Calculate Diffuse intensity
    float fDiffuseIntensity = max( 0.0, dot( vTransformedNormal, vLightDirection ));
 
    // Calculate resulting Color
    vec4 vDiffuseColor;
    vDiffuseColor.xyz = vDiffuseMaterial.rgb * fDiffuseIntensity;
    vDiffuseColor.a = 1.0;
 
    return vDiffuseColor;
}
 
void main(void)
{
    vec4 ambientColor = GetAmbientColor();
    vec4 diffuseColor = GetDiffuseColor();
    vec4 specularColor = GetSpecularColor();
 
    // Combine into final color
    vVaryingColor = ambientColor + diffuseColor + specularColor;
 
    // Transform the vertex to MVP Space
    gl_Position = mProjection * mView * mModel * vVertexPosition;
}

Frag:

#version 130
 
out vec4 vFragColor;
in vec4 vVaryingColor;
 
void main(void)
{
    // Just use the interpolated color as our output
    // (Colors between vertices are interpolated)
   vFragColor = vVaryingColor;
}

@Ignatz ok, that makes a lot of sense. So in my ambient + diffuse-only shader, where the normal calculations all take place in the vertex shader, I can just do this:

vec4 norm = normalize(modelViewProjection * vec4(normal,0.0));

@TheSolderKing I’ve never used the shader editor. If I were to port this, I’d put it in a regular project, as you’d need to apply it to some 3D models and supply all of the variables it asks for before you’d see anything. That’s probably beyond what the shader editor can do.

It would help if you told us what you changed.

@TheSolderKing There also seems to be slightly different syntax for different implementations of OpenGL ES, and I’m definitely a bit hazy on this, e.g. vFragColor here would be gl_FragColor on iOS, etc. With the inputs too, you could probably use a single uniform mat4 modelViewProjection rather than the separate model, view, and projection matrices you get here (though you might need to additionally pass the shader the modelView matrix for the specular calculation).
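
For comparison, here’s roughly what the same interface looks like in GLSL ES 2.0 (the flavour Codea uses). This is just a sketch, using Codea’s standard attribute and uniform names:

    // GLSL ES 2.0 / Codea style: attribute/varying rather than in/out,
    // and a single combined matrix supplied by Codea
    uniform mat4 modelViewProjection;
    
    attribute vec4 position;
    attribute vec4 color;
    
    varying lowp vec4 vColor;
    
    void main()
    {
        vColor = color;
        gl_Position = modelViewProjection * position;
    }
    
    // ...and the fragment shader writes to the built-in gl_FragColor
    // instead of declaring its own "out" variable:
    //
    //     varying lowp vec4 vColor;
    //     void main() { gl_FragColor = vColor; }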

Some of that code can be simplified; I’ll post something soon.

One thing I’ve never quite got my head around is why some lighting shaders, such as this one, need you to pass in the model or modelView matrix for multiplying the normals, whereas others just treat the normals the same as the positions, i.e. multiply them by the modelViewProjection matrix. I’ve tried both and I can’t really see any difference in my tests so far.

This set of code tabs shows a variety of lighting options including point and spot lights

https://gist.github.com/dermotbalson/90432375dfd790f89728

I’ve also written an ebook on lighting, here

https://gist.github.com/dermotbalson/90432375dfd790f89728

@Ignatz yes, your series of tutorials and the ebooks are absolutely required reading! The lighting shader printer that you have on Codea Community is also a fantastic way of experimenting with different lighting setups. Although I’m not using your lighting library, I have been adapting the shaders produced by the shader printer. I’ve also been using your importer for Blender .obj files. All absolutely essential resources for 3D.

You need to modify the normals depending on how you’re going to use them. If you use them in the vertex shader along with the position attribute, that attribute is in “model space”, like the normals, so they don’t need modification.

However, if you’re going to use the normals in the fragment shader, and compare them with lighting variables passed in as uniforms from Codea, then those variables are in world space, so you need to apply the model matrix to the normals in the vertex shader so they are in world space when they reach the fragment shader. (This is generally what I use, because Codea lighting settings - such as camera position in your scene - naturally belong in world space).

If your lighting code relies on coordinates in camera space, then you need to apply the view matrix to the normals as well.

So the question is - what am I going to use these normals with? What space is that in? That will tell you what conversion is needed.
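
As a rough vertex-shader sketch of those three cases (the matrix uniform names are just illustrative, and strictly speaking a non-uniformly scaled model needs the inverse-transpose of the matrix rather than the matrix itself):

    uniform mat4 modelMatrix;       // illustrative names; Codea normally
    uniform mat4 viewMatrix;        // hands you a combined modelViewProjection
    uniform mat4 projectionMatrix;
    
    attribute vec4 position;
    attribute vec3 normal;
    
    varying vec3 vNormalWorld;
    varying vec3 vNormalEye;
    
    void main()
    {
        // model space: the raw attribute, fine if your lighting data is
        // also in model space (unused below, shown for comparison)
        vec3 nModel = normalize( normal );
    
        // world space: apply the model matrix; w = 0.0 so the translation
        // part of the matrix is ignored (normals are directions, not points)
        vNormalWorld = normalize( ( modelMatrix * vec4( normal, 0.0 ) ).xyz );
    
        // camera/eye space: apply view * model as well
        vNormalEye = normalize( ( viewMatrix * modelMatrix * vec4( normal, 0.0 ) ).xyz );
    
        gl_Position = projectionMatrix * viewMatrix * modelMatrix * position;
    }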

I’ve done specular in the fragment shader on an iPad3 with no problems. I’ll post code soon, tied up right now.

I have been attempting to modify this code after you posted it, but didn’t get far before I found something that didn’t make sense. Here it is so far:


 
// Our Model, View and Projection matrices
// we need them to transform the incoming vertices and for lighting
// calculation
uniform mat4 mModel;
uniform mat4 mView;
uniform mat4 mProjection;
 
// Position of the "Camera" and the Light
// in Model space
uniform vec4 vEyePosition;
uniform vec4 vLightPosition;
 
// The Colors of the material
uniform vec4 vAmbientMaterial;
uniform vec4 vDiffuseMaterial;
uniform vec4 vSpecularMaterial;
 
// Vertex properties
attribute vec4 vVertexPosition;
attribute vec3 vVertexNormal;
 
// Final color of the vertex we pass on to the next stage
varying vec4 vVaryingColor;
 
// Returns the specular component of the color
vec4 GetSpecularColor()
{
    // Transform the Vertex and corresponding Normal into Model space
    vec4 vTransformedNormal = mModel * vec4( vVertexNormal, 1 );
    vec4 vTransformedVertex = mModel * vVertexPosition;
 
    // Get the directional vector to the light and to the camera
    // originating from the vertex position
    vec4 vLightDirection = normalize( vLightPosition - vTransformedVertex );
    vec4 vCameraDirection = normalize( vEyePosition - vTransformedVertex );
 
    // Calculate the reflection vector between the incoming light and the
    // normal (incoming angle = outgoing angle)
    // We have to use the invert of the light direction because "reflect"
    // expects the incident vector as its first parameter
    vec4 vReflection = reflect( -vLightDirection, vTransformedNormal );
 
    // Calculate specular component
    // Based on the dot product between the reflection vector and the camera
    // direction.
    //
    // hint: The Dot Product corresponds to the angle between the two vectors
    // hint: if the angle is out of range (0 ... 180 degrees) we use 0.0
    float spec = max( 0.0, dot( vCameraDirection, vReflection ))^ 32
 
    return vec4( vSpecularMaterial.r * spec, vSpecularMaterial.g * spec, vSpecularMaterial.b * spec, 1.0 );
}
 
// Ambient color component of vertex
vec4 GetAmbientColor()
{
    return vAmbientMaterial * vec4( 0.2, 0.2, 0.2, 1.0 );
}
 
// Diffuse Color component of vertex
vec4 GetDiffuseColor()
{
    // Transform the normal from Object to Model space
    // we also normalize the vector just to be sure ...
    vec4 vTransformedNormal = normalize( mModel * vec4( vVertexNormal, 1 ));
 
    // Get direction of light in Model space
    vec4 vLightDirection = normalize( vLightPosition - vTransformedNormal );
 
    // Calculate Diffuse intensity
    float fDiffuseIntensity = max( 0.0, dot( vTransformedNormal, vLightDirection ));
 
    // Calculate resulting Color
    vec4 vDiffuseColor;
    vDiffuseColor.xyz = vDiffuseMaterial.rgb * fDiffuseIntensity;
    vDiffuseColor.a = 1.0;
 
    return vDiffuseColor;
}
 
void main(void)
{
    vec4 ambientColor = GetAmbientColor();
    vec4 diffuseColor = GetDiffuseColor();
    vec4 specularColor = GetSpecularColor();
 
    // Combine into final color
    vVaryingColor = ambientColor + diffuseColor + specularColor;
 
    // Transform the vertex to MVP Space
    gl_Position = mProjection * mView * mModel * vVertexPosition;
}

When I put this code in the shader editor, the shader says ‘return:’ syntax error. Anyone know why?

OK, here’s my attempt at an Ambient/Diffuse/Specular lighting shader which does the expensive calculations in the vertex shader, leaving the fragment shader mostly just reading interpolated values.

It’s adapted from @Ignatz’s ADS shader, with the idea of doing the specular calculation in the vertex shader taken from Rath’s code at the top of the page.

The idea is for it to be as streamlined as possible, so there’s no coloured lighting or anything like that. It does, however, render the colour of the mesh (in addition to its texture).

Bindings are just:

  • ambient 0-1, strength of ambient light

  • eye, vec4 camera position

  • light, vec4 normalized light direction

I haven’t extensively tested it, but with mid- to high-poly objects it looks good: difficult to distinguish from per-pixel specular shading (with low-poly objects, an 8-vertex cube or whatever, you can’t see a specular highlight, so it just looks like ambient + diffuse lighting). I haven’t benchmarked it, but in theory it should be faster.

    ADSTex = shader(
    [[
    
    uniform mat4 modelViewProjection;
    uniform float ambient; // --strength of ambient light 0-1
    uniform vec4 eye; // -- position of camera (x,y,z,1)
    uniform vec4 light; //--directional light direction (x,y,z,0)
    
    attribute vec4 position;
    attribute vec4 color;
    attribute vec2 texCoord;
    attribute vec3 normal;
    
    varying lowp vec4 vColor;
    varying highp vec2 vTexCoord;
    varying vec4 vDirectDiffuse;
    varying vec4 vSpecular;
  
    void main()
    {
        vec4 norm = normalize(modelViewProjection * vec4( normal, 0.0 ));
        vDirectDiffuse = vec4(1.0,1.0,1.0,1.0) * max( 0.0, dot( norm, light )); // diffuse term (white directional light)
        vec4 vPosition = modelViewProjection * position;
    
        //specular blinn-phong
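        // the half-angle vector between the view and light directions stands
        // in for a true reflection vector: dot(norm, halfAngle) peaks when
        // the surface mirrors the light towards the camera, and it saves a
        // per-vertex reflect() call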
        vec4 cameraDirection = normalize( eye - vPosition );
        vec4 halfAngle = normalize( cameraDirection + light );    
        float spec = pow( max( 0.0, dot( norm, halfAngle)), 5. ); // last number is the specular exponent; higher = smaller, tighter highlight
        vSpecular = vec4(1.,1.,1.,1.)  * spec; // add optional shininess at end here
    
        vColor = color * ambient; 
        vTexCoord = texCoord;
        gl_Position = vPosition;
    }
    
    ]],
    
     [[
    
    precision highp float;
    
    uniform lowp sampler2D texture;

    varying lowp vec4 vColor;
    varying highp vec2 vTexCoord;

    varying vec4 vDirectDiffuse;
    varying vec4 vSpecular;
    
    void main()
    {
        lowp vec4 pixel;
        lowp vec4 ambient;

        pixel = texture2D( texture, vTexCoord );
        ambient = pixel * vColor;

        lowp vec4 diffuse = pixel * vDirectDiffuse;

        vec4 totalColor = clamp( ambient + diffuse + vSpecular, 0.0, 1.0 );
        totalColor.a = 1.0;
        gl_FragColor = totalColor;
    }
    
    ]])

Yes, by its nature, vertex-based lighting will work best on high-poly objects, and not on low-poly objects with large flat triangles.

However, I think the way to make the most efficient lighting is to keep it as simple as possible (something I have read from a number of professionals).

For example, in my 3D dungeon, I have a simple diffuse light with a maximum radius, and a little flicker added for effect (basically, just wobbling the maximum radius). The reason this is so efficient is that I can set the camera perspective maximum distance equal to the light radius, which has a huge effect on speed, and I can also test whether my 3D objects in the dungeon are within the light radius before bothering to draw them.
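
As a rough fragment-shader sketch of that kind of light (not my exact code, and the varying and uniform names are made up):

    precision highp float;
    
    uniform lowp sampler2D texture;
    uniform vec3 lightPos;      // world-space light position
    uniform float lightRadius;  // wobble this slightly each frame for the flicker
    uniform float ambient;      // base light level, 0-1
    
    // assumed world-space varyings from the vertex shader
    varying vec3 vWorldPos;
    varying vec3 vNormal;
    varying highp vec2 vTexCoord;
    
    void main()
    {
        lowp vec4 pixel = texture2D( texture, vTexCoord );
    
        // linear falloff that reaches zero exactly at the light radius,
        // so anything beyond the radius gets only ambient light
        float dist = distance( vWorldPos, lightPos );
        float atten = clamp( 1.0 - dist / lightRadius, 0.0, 1.0 );
    
        float diff = max( 0.0, dot( normalize( vNormal ),
                                    normalize( lightPos - vWorldPos ) ) );
    
        gl_FragColor = vec4( pixel.rgb * ( ambient + diff * atten ), 1.0 );
    }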

So I think that optimising lighting isn’t just about squeezing in all the lighting you can afford, but rather about leaving out everything you possibly can. This is also true of 3D modelling, where I’ve learned that game developers have all sorts of tricks for “cheating” and avoiding drawing complex 3D models.

I’m sure there will be cases where specular lighting makes a real difference, but if you’re making games, I think you can probably leave it out, because the user is not going to notice it if you throw enough zombies at him.

A quick video of the shader in action. When the object is as large as this, the low-ish poly count starts to become visible…

http://youtu.be/ltV_7j94ae8

nice anyway

I quite like it; low-poly models somehow have a nice aesthetic feel, for me at least. I wouldn’t have expected the lighting to work this well on a low-poly model, actually.

I made the surface of the shell look rough by adding noise to the vertices. But now I’m thinking it would look more ladybird-like with a smooth shell. Sometimes it’s just so hard to resist that noise button…

Is there any way that this can be modified to support many different light sources?

@KidKoder yes, you’d pass in a numberOfLightSources uniform, make the lightPosition uniform an array, and use a for loop in the fragment shader to loop through all the lights, something like the sketch below. It might impact performance, though.
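
Something along these lines, as a minimal sketch; MAX_LIGHTS and all the names here are made up, and note that GLSL ES for-loops need a constant bound, hence the define:

    precision highp float;
    
    #define MAX_LIGHTS 4
    
    uniform int numLights;              // how many of the array slots are in use
    uniform vec3 lightPos[MAX_LIGHTS];  // world-space point-light positions
    uniform float ambient;
    
    varying vec3 vWorldPos;
    varying vec3 vNormal;
    varying lowp vec4 vColor;
    
    void main()
    {
        vec3 n = normalize( vNormal );
        float lit = ambient;
    
        // fixed loop bound for GLSL ES; skip the unused slots at runtime
        for ( int i = 0; i < MAX_LIGHTS; i++ )
        {
            if ( i >= numLights ) break;
            vec3 toLight = normalize( lightPos[i] - vWorldPos );
            lit += max( 0.0, dot( n, toLight ) );
        }
    
        gl_FragColor = vec4( vColor.rgb * min( lit, 1.0 ), 1.0 );
    }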