Explosion shader (code & video)

I’m having a very strange issue.

I’m working on an explosion shader. It’s inspired by @Andrew_Stacey’s 2D explosion shader here,


but it’s for 3D models. It disintegrates each of the faces along a trajectory derived from the face normal (unlike Andrew’s shader, mine assumes the model is already multi-faceted; I don’t further subdivide the faces).
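For anyone curious how per-face attributes like that might be derived, here’s a rough sketch. This is Python/NumPy purely for illustration (the actual project is Codea/Lua, and the function and variable names here are made up): each face’s origin is its centroid, and the trajectory direction is the unit face normal, repeated for all three vertices of the face.

```python
import numpy as np

def face_attributes(verts):
    """verts: (n_faces, 3, 3) array of triangle-soup vertices.
    Returns a per-vertex origin (the face centroid) and a unit
    trajectory along the face normal, one row per vertex."""
    origins = verts.mean(axis=1)              # centroid of each face
    e1 = verts[:, 1] - verts[:, 0]            # two edge vectors
    e2 = verts[:, 2] - verts[:, 0]
    normals = np.cross(e1, e2)                # face normals
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    # every vertex of a face carries the same attribute values
    return np.repeat(origins, 3, axis=0), np.repeat(normals, 3, axis=0)

tri = np.array([[[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]]])
ori, traj = face_attributes(tri)
```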

Now, the weird thing is, the shader was actually coming along nicely within a large project I’m working on, when I decided to export it to a minimal working example (MWE) to work on it further (and to share it). And, ironically enough, I can’t get the custom attribute buffer to work at all in the MWE. In the parent code, it works about 70% of the time, but does sometimes produce this error.

It’s probably something really simple, and I’ve just been staring at it too long, but I can’t work out what I’m doing wrong at all.

The bit that triggers the error is when I try to index the custom buffers. When this code runs:

    local origin = m:buffer("origin")
    local trajectory = m:buffer("trajectory")
    origin:resize(#verts) --shouldn't be necessary. Triggers error.
    for i=1, #verts do
        origin[i] = ori[i] --triggers error
    end

I get:

error: [string "explode = {}…"]:9: attempt to index a nil value (local 'origin')

The mesh has been set up, it has vertices, colours, normals (so the custom buffers shouldn’t need resizing), it has a shader that compiled without error, which uses the custom attribute.

What have I forgotten??

The full MWE is here (it borrows code from @Ignatz, @Andrew_Stacey, and whoever made the Icosphere function):


I’d be very grateful if someone could put me out of my misery.

I’ve figured it out. It was an issue in the plumbing between the vertex and the fragment shader: the latter was expecting a varying called vNormal, but the former was supplying one called vDirectDiffuse instead. It’s weird, though, that the mismatch surfaces as an error on a different attribute. I’ll post the working version soon, as there are some other elements I’d like help with.

Here’s a video. Code is on its way. Does the resolution of the video look really terrible to you? For some reason the stuff I upload to YouTube always ends up looking awful.


I was just about to post the solution … but you found it yourself.

As the video quality is so crappy, here’s an image:

bang bang

The gist link above has been updated with functioning code. The next challenge… gravity (I’ll have to refer back to Andrew’s 2d shader I think…)

@yojimbo2000 - looking good!


I’ve updated the code at the gist again so that it now has gravity. The gravity and friction code is again adapted from Andrew’s shader. One tricky thing is working out which way is down while the model is rotating: to get the world gravity vector into model space, I load the inverse of the model matrix into the shader. Here’s the updated vertex shader. Suggestions and criticisms welcome.

explodeVert=    [[

uniform mat4 modelViewProjection;
uniform mat4 modelMatrix;
uniform mat4 modelInverse; //inverse of the model matrix
uniform vec4 eye; // -- position of camera (x,y,z,1)
//uniform vec4 light; //--directional light direction (x,y,z,0)
uniform float fogRadius;
uniform vec4 lightColor; //--directional light colour
uniform float time;// animate explosion
//uniform bool hasTexture;
const vec4 grav = vec4(0.,0.,-0.02,0.);
const float friction = 0.02;

attribute vec4 position;
attribute vec4 color;
attribute vec2 texCoord;
attribute vec3 normal;
attribute vec4 origin; //centre of each face
attribute vec4 trajectory; // trajectory + w = angular velocity

varying lowp vec4 vColor;
varying float dist;
varying highp vec2 vTexCoord;
varying vec4 vNormal;

void main()
{
    float angle = time * trajectory.w;
    float angCos = cos(angle);
    float angSin = sin(angle);
    lowp mat2 rotMat = mat2(angCos, angSin, -angSin, angCos); 
    vec3 normRot = normal;
    normRot.xy = rotMat * normRot.xy;

    vNormal = normalize(modelMatrix * vec4( normRot, 0.0 ));
   // vDirectDiffuse = lightColor * max( 0.0, dot( norm, light )); // brightness of diffuse light
    vec4 gravity = modelInverse * grav; //convert world gravity vector into model coords
    highp vec4 A = gravity/(friction*friction) - vec4(trajectory.xyz, 0.)/friction;
    highp vec4 B = origin - A;

    vec4 pos = position - origin; // convert to local
    pos.xy = rotMat * pos.xy; // rotate
    pos += exp(-time*friction)*A + B + time * gravity/friction;

    vec4 vPosition = modelMatrix * pos;
    dist = clamp(1.0-distance(vPosition.xyz, eye.xyz)/fogRadius+0.1, 0.0, 1.1); //(vPosition.y-eye.y)
    vColor = color;
    vTexCoord = texCoord;
    gl_Position = modelViewProjection * pos;
}
]]


Sorry, the code above is a bit of a mess; I’m transitioning from Gouraud (per-vertex) to Phong (per-fragment) lighting, which is why there are bits of unused lighting code still clogging it up. That’s what got me into trouble in the first place.
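For reference, the motion line in the shader, `pos += exp(-time*friction)*A + B + time*gravity/friction`, is the closed-form solution of damped motion under constant gravity, dv/dt = g − f·v, with A = g/f² − v₀/f and B = x₀ − A. A quick numerical check of that claim (Python/NumPy here just to verify the maths; `f`, `g`, `x0` and `v0` stand in for the shader’s friction, grav, origin and trajectory):

```python
import numpy as np

f = 0.02                        # friction
g = np.array([0., 0., -0.02])   # gravity (the shader's grav.xyz)
x0 = np.array([1., 2., 3.])     # face origin
v0 = np.array([0.5, 0., 0.1])   # initial velocity (trajectory.xyz)

A = g / f**2 - v0 / f
B = x0 - A

def closed_form(t):
    # what the vertex shader evaluates per frame
    return np.exp(-f * t) * A + B + t * g / f

# integrate dv/dt = g - f*v with small Euler steps up to t = 10
dt, x, v = 1e-3, x0.copy(), v0.copy()
for _ in range(int(10 / dt)):
    v += (g - f * v) * dt
    x += v * dt
```

The closed form should reproduce the initial position at t = 0 and track the step-by-step integration closely, which is why the shader can evaluate the whole trajectory from `time` alone instead of accumulating state.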

I suppose really now that each face is double-sided I should check for non front-facing fragments in the frag shader and invert the normal…

I’m not sure back-face normal inverting makes any difference, as far as I can see. I’ve read that this is the “correct” way to do double-sided faces. Has anyone tried this?

Change these lines in the fragment shader :

    vec4 norm = vNormal;
    if (! gl_FrontFacing) norm = -vNormal;
    vec4 vDirectDiffuse = lightColor * max( 0.0, dot( norm, light )); // brightness of diffuse light

Wrt gravity, it might be a little simpler to pass the shader a uniform of the vector (0,-1,0) multiplied by modelMatrix, to tell it which way is down.

Yes, that’s a good idea, thank you. Rather than multiplying by the inverse model matrix on every vertex, I can do that calculation once and pass the result to the shader.
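A sketch of that equivalence (Python/NumPy standing in for the Codea/GLSL code; `M` plays the role of modelMatrix): transforming the gravity direction once on the CPU gives exactly the vector every vertex would have computed, and a direction vector keeps w = 0 so the matrix’s translation part doesn’t leak into it.

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.eye(4)
M[:3, :3] = np.linalg.qr(rng.standard_normal((3, 3)))[0]  # random orthonormal basis
M[:3, 3] = [10., 20., 30.]                                # translation

grav_world = np.array([0., 0., -0.05, 0.])   # a direction, so w = 0
g_once = np.linalg.inv(M) @ grav_world       # computed once, passed as a uniform

# what the shader was doing redundantly for every vertex
per_vertex = [np.linalg.inv(M) @ grav_world for _ in range(3)]
```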

I should also point out, if anyone’s interested in using the code in their own projects, that it’s for a Z-up orientation (I just like working in Z-up). This means that because I’m only bothering to do a 2D rotation (as the angle computation is much easier, and I was adapting a 2D source), each fragment rotates around the model’s up axis (so the fragments appear to change size, and catch the light as they rotate). I think it looks good, and I’m not going to bother adding code to compute rotations around the other axes of each fragment.
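To spell that out: applying a 2×2 rotation to just the xy components is exactly a rotation about the model’s z (up) axis, since z is left untouched. A quick check (Python/NumPy for illustration; note that GLSL’s mat2(c, s, -s, c) is column-major, i.e. the matrix [[c, −s], [s, c]]):

```python
import numpy as np

angle = 0.7
c, s = np.cos(angle), np.sin(angle)
rot2 = np.array([[c, -s],
                 [s,  c]])          # GLSL mat2(c, s, -s, c), column-major

p = np.array([1.0, 2.0, 3.0])
q = p.copy()
q[:2] = rot2 @ q[:2]                # rotate only the xy components

# the equivalent full 3D rotation about the z (up) axis
Rz = np.array([[c, -s, 0.],
               [s,  c, 0.],
               [0., 0., 1.]])
```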

Ok, trying to implement @Ignatz’s suggestion above, and I’ve got it working, but something rather unexpected has happened:

    m.shader.gravity = modelMatrix():inverse() * vec4(0,0,0,-0.05) --why is the gravity vector on w?

For some reason, when doing this calculation outside of the shader, the down vector is on w, not z. Why?? This is replacing these lines:

    --in draw
     m.shader.modelInverse = modelMatrix():inverse() 
// in the vert shader
    const vec4 grav = vec4(0.,0.,-0.05,0.); //down on the z axis
    vec4 gravity = modelInverse * grav; //convert world gravity vector into model coords

How come down shifts from z to w when the calculation is moved to outside the shader?

@yojimbo2000 - I guess one reason is that this doesn’t work

v = modelMatrix() * vec4(1,1,1,1)
--test if we can reverse it
m = modelMatrix():inverse()
v2 = m * v  --< (0.94, 1.20, 0.74, 0) --doesn't reverse!

I’m no matrix expert, so I have no immediate explanation.
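For what it’s worth, in plain linear algebra the inverse does reverse the transform, so the Codea result above may come down to how that test was written rather than the maths. A quick check outside Codea (Python/NumPy for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
M = np.eye(4)
M[:3, :3] = rng.standard_normal((3, 3))   # arbitrary linear part
M[:3, 3] = rng.standard_normal(3)         # arbitrary translation
assert abs(np.linalg.det(M)) > 1e-9       # make sure it actually inverts

v = np.array([1., 1., 1., 1.])
v_world = M @ v                           # forward transform
v_back = np.linalg.inv(M) @ v_world       # and back again
```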

However, I think a better approach is this

  1. pass modelMatrix to the vertex shader
  2. apply it to the current position attribute to get the world position
  3. compare that with your gravity vector (0,0,-0.05)

This is a little more work for the shader, but it should work, and also avoids inverting the model matrix, something our resident mathematician says is a Very.Bad.Idea (possibly because not all matrices will invert).

  1. pass modelMatrix to the vertex shader
  2. apply it to the current position attribute to get the world position
  3. compare that with your gravity vector (0,0,-0.05)

with that approach you’d then also need to create your own viewProjection matrix to use instead of the built-in modelViewProjection. Something like this perhaps:

vec4 worldPos = modelMatrix * pos;
// do gravity calculations in world space
gl_Position = viewProjection * worldPos;

I guess one reason is that this doesn’t work

Damn. So how do you translate a world coordinate into local space? What does your getLocalPoint function look like?

@yojimbo2000 - Based on previous Codea discussions, converting from world space to object space is done like this

--in Codea

//in the vertex shader  
uniform mat4 mInvModel;

//in main, convert vector v to object space  
v = v * mInvModel;

Maybe OpenGL’s matrix multiplication works differently than Codea’s? In the test below I get much better results with modelMatrix():inverse() than you reported above; the result is within rounding error. But if I change it to modelMatrix():inverse():transpose(), the result is totally wrong.

-- ModelMatrix inverse as getLocalPoint
-- reasonably accurate
function setup()
    print ("test of using the inverse of the model matrix as a get local point \
tap screen to create a new random transform")
    --   parameter.watch("worldToLocal")
    randomTransform()
end

function randomTransform()
    x,y,z = rand(), rand(), rand()
    a,b,c = rand(720), rand(720), rand(720)
    u,v,w = rand(), rand(), rand()
end

function rand(range)
    local range = range or 2000
    return (math.random()-0.5)*range
end

function draw()
    background(40, 40, 50)
    -- apply the random transform so modelMatrix() has something to invert
    translate(x,y,z)
    rotate(a) rotate(b, 0,1,0) rotate(c, 1,0,0)
    localCoord = vec3(u,v,w)
    mat = modelMatrix()
    worldToLocal = mat:inverse() -- adding :transpose() makes the results really bad
    worldCoord = mat * localCoord
    localBackFromWorld = worldToLocal * worldCoord
    closeness = localBackFromWorld - localCoord
end

function touched(t)
    if t.state==BEGAN then randomTransform() end
end
Yes, when trying to reverse modelMatrix in Codea, I probably should have put the vector on the left, which Codea doesn’t support.

I haven’t done much conversion from world to object, so I’m not sure.

Interestingly, inverting modelMatrix transposes the rotation part and makes the translation part negative…
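That observation matches the standard identity for rigid transforms: if M combines a rotation R with a translation t, then M⁻¹ has rotation block Rᵀ and translation −Rᵀt, so the rotation part is transposed and the translation only comes out “just negated” when the rotation happens to be the identity. A quick check (Python/NumPy for illustration):

```python
import numpy as np

theta = 0.6
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.],
              [s,  c, 0.],
              [0., 0., 1.]])        # rotation about z
t = np.array([4., -2., 7.])         # translation

M = np.eye(4)
M[:3, :3] = R
M[:3, 3] = t

Minv = np.linalg.inv(M)             # rotation block -> R.T, translation -> -R.T @ t
```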