Rigging 3D models in Codea - First animation

My main frustration with 3D is not being able to animate 3D models, e.g. wave their arms around or walk. Programs like Blender do it by providing "armatures" (effectively bones) which you can put inside a 3D figure and then move around, and the program figures out how to distort your mesh. But you can't export the armatures to anything Codea can use, and Codea wouldn't know what to do with them anyway.

So because I like an insane challenge, I am building a rigging program in Codea. I’ve defined a number of body joints, and created “bones” that run between them. The model’s mesh will need to be broken into pieces, one for each bone. Then as the joints move, the body pieces and meshes will change direction along with them.

Each animation will require a number of frames, which means setting and storing x,y,z positions for each joint, for each frame. The playback program can then play back the frames, positioning and rotating the body meshes.
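To make the frame idea concrete: Codea is Lua-based, but the storage-and-playback logic is language-agnostic, so here is a minimal Python sketch. Each frame maps joint names to (x, y, z) positions, and playback linearly interpolates between stored frames. The names, frame duration, and data are illustrative assumptions, not the author's actual format.

```python
def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def sample_pose(frames, time, frame_duration=0.2):
    """Return interpolated joint positions at a given playback time."""
    i = int(time / frame_duration) % len(frames)
    j = (i + 1) % len(frames)               # wrap around for looping animations
    t = (time % frame_duration) / frame_duration
    a, b = frames[i], frames[j]
    return {joint: lerp(a[joint], b[joint], t) for joint in a}

# Two-frame example: the "hip" joint moves 1 unit along x per stride.
frames = [
    {"hip": (0.0, 1.0, 0.0)},
    {"hip": (1.0, 1.0, 0.0)},
]
pose = sample_pose(frames, time=0.1)        # halfway through frame 0
```

The same loop in Lua would run each draw call, feeding the interpolated positions to the body meshes.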

There is, of course, a lot of work setting positions for nearly 20 joints, for every frame, and making all the body parts move together. That’s why Blender has armatures and all sorts of helper functions. So I’ve created the best approximation I can.

My rigging program allows me to use Codea to drag joints in 3D, to position the body to create one frame of an animation, save the positions, and then do the next frame, just as you do in Blender. Codea can then play back the animation in any of my other 3D programs, to create the effect I want.

The hard part is keeping the body parts moving coherently, so if you move the foot, it doesn’t detach from the leg, or bend the leg the wrong way. I believe this is known as inverse kinematics, or IK. I’m including some IK in my program for arm and leg movements.
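A common textbook form of this (not necessarily what the author implemented) is the planar two-bone IK solve via the law of cosines: given two bone lengths and a target for the end of the chain, trigonometry yields the two joint angles directly. The Python sketch below is illustrative; bone lengths and targets are assumptions.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Return (hip_angle, knee_angle) so a two-bone chain reaches (tx, ty).

    Angles are in radians; the chain root sits at the origin."""
    d = math.hypot(tx, ty)
    d = min(d, l1 + l2 - 1e-9)              # clamp targets that are out of reach
    # Interior knee angle from the law of cosines, converted to a bend angle.
    cos_knee = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle = direction to target, minus the offset toward the knee.
    cos_off = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    hip = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_off)))
    return hip, knee
```

With both bones of length 1 and a target at (0, √2), this gives a hip angle of 45° and a knee bend of 90°, which is exactly the pose you would sketch on paper.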

Initially, I’m just using a body without a mesh, and my progress is shown in the video below. Clearly, there is a lot of work to go, but I thought I’d share it for anyone who is interested.


The Quake 3 source code is open source. Why not take a look at how they handled the import/rendering of the model data and animations? Then you could just export your model (with bones) and animation sequence from Blender instead of having to animate it via Lua.


edit: here’s the Doom 3 (2004) source as well, since it’s newer than Quake 3, and also written in C++

Check out all the stuff labeled “model”

Here is my very first animation :slight_smile:


It uses 5 frames, which cover one stride. The jerkiness in rotation is purely because I am rotating manually. (You may see a few loose vertices floating free - it was very difficult to cut the limbs apart cleanly because the model had a lot of internal rigging vertices).

I’m getting something like 45 FPS.

It’s great. Codea arrives in the fourth dimension. I imagine it with Iron Man’s red and yellow colours.

@SkyTheCoder - I probably wouldn’t, because it’s very hard, according to a book I’m using as a reference. You can do faces with vertex-based systems, or with a skeletal (bone) system.

I’m certainly not going near that in the near future!

@Ignatz What if you want to animate a character’s face? How would you do that using separate meshes?

@spacemonkey, @TechDojo - I’ve rejigged how I’m doing this. Instead of dragging joints around (which creates problems because I have to keep bones the same length), I am simply rotating them, as shown in the video below, and that works much better.


I think working with separate limb meshes is going to be the best approach in Codea, because it runs so much faster than vertex manipulation (I had a look at doing that, but apart from speed issues, it is extremely difficult, especially when your figure has an irregular shape).

I’m not using any FK or IK now (unless you count flowing rotations down a limb, so that when you rotate a shoulder, it also rotates the elbow and hand), because when you only have a couple of linked bones, it isn’t worth the effort. All I’m doing is restricting movement in certain directions, depending on the joint, so a knee joint can only rotate backwards.
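One way to express those per-joint restrictions (a sketch of the idea, not the author's Lua code, and the limit values are invented for illustration) is a table of allowed angle ranges that every proposed rotation gets clamped to:

```python
import math

# Hypothetical per-joint angle limits, in radians. A knee can only bend
# backwards (positive angles), while a shoulder is left unrestricted.
JOINT_LIMITS = {
    "knee":     (0.0, math.radians(150)),
    "elbow":    (0.0, math.radians(145)),
    "shoulder": (-math.pi, math.pi),
}

def clamp_joint(joint, angle):
    """Clamp a proposed rotation to the joint's allowed range."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, angle))
```

Dragging a knee "the wrong way" then simply clamps to zero instead of bending the leg forwards.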

The real work has been managing rotations, because I have so many different layers - the figure can be placed in a scene and rotated, and then it can be further rotated (as part of an animation), then individual body parts can be rotated as well. And of course, rotations to one body part may affect rotations for other body parts, which have their own rotations… (Did I mention I hate rotations?)

Additionally, I have 4 primary camera views (front, left, right, top), with manual adjustments to those views, which create more rotations.

I am using quaternions for all of this, and while they are marvellous, it has been a nightmare sorting them all out.
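The nice property being relied on here is that layered rotations (scene, animation, individual joint) compose by quaternion multiplication. Codea has its own quaternion support; the standalone Python sketch below just shows the underlying math, with illustrative axis/angle values.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion for a rotation of `angle` radians about `axis`."""
    x, y, z = axis
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), x * s, y * s, z * s)

def quat_mul(q, r):
    """Hamilton product: composes rotation r followed by q."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q * v * q^-1)."""
    w, x, y, z = q
    p = quat_mul(quat_mul(q, (0.0, *v)), (w, -x, -y, -z))
    return p[1:]

# Two 45-degree turns about the y axis compose into one 90-degree turn,
# which sends the x axis to the negative z axis.
half = quat_from_axis_angle((0, 1, 0), math.pi / 4)
full = quat_mul(half, half)
```

Stacking scene, animation, and joint rotations is then just more multiplications in the right order, which is exactly where the "nightmare sorting them out" lives.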

I can store joint positions to create frames, so now I’m going to see if I can create and play back an animation.

I was actually working on my own rigging code a while ago, based on @spacemonkey’s rigging. You can check out his over here: http://codea.io/talk/discussion/2901/skeletal-animation-shader#Item_1

I only have a box that stretches right now, yours looks amazing! How did you make the 3D IK?

I’m still working on it, only just started with IK, which requires lots of trigonometry.

I see spacemonkey used a shader and weighted bone effects. I’m keeping it much simpler and using separate meshes, one per bone, and no shader.

This is how a lot of games were built until recently. That’s why characters had shoulderpads and big belts, to hide the not-quite-working joins at shoulders and hips!

Very nice!

I messed around with the Unreal Development Kit for a little while, which used a lot of skeletal meshes like the ones you are creating. Thankfully the code didn’t have to be written from scratch, as you are doing.

Interested to see where you take this.

I’ve got my joint movements working (mostly) and I’ve carefully chosen a 3D model to work with (below). It is small (only 100 vertices), has clearly defined limbs and nice big shoulders and knees that I hope will cover any “tears” that happen when limbs bend.

I’ve carefully chopped the model into 12 separate meshes in Blender, and imported them into Codea.


I still have to position my joints inside this model, get the drawing to work with meshes instead of sticks, and then I can start creating animation frames.

That’s only 100 vertices??? That’s a bloody good model for such a low vertex count; obviously the texturing makes a lot of difference.

I’m not sure if you’re handling lighting, but it would be interesting to see it untextured - just the polys with some flat lighting.

Where did you find the model?

Sorry, 1100 vertices! :stuck_out_tongue:

I found it on a free site (tf3dm.com) with thousands of models, most already in obj format.

Ok - 1100 seems a little more reasonable :slight_smile:

In mine (apart from the shader mojo), I set the bones up in a structure where they link to each other. This solves the foot-detaching kind of scenario, because each joint’s movement is relative to its parent joint. So when you move the hip, it cascades down the bone tree to the knee and foot to keep them all synced up.

That’s done in the bone part in Lua before it gets anywhere near the shader, and a similar approach should work for you too.
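The bone-tree idea can be sketched in a few lines. This is an illustration of the cascading principle, not anyone's actual code: each bone stores its parent and an angle relative to that parent, so rotating the hip automatically carries the knee and foot along. A 2D chain with made-up bone names and lengths keeps it short:

```python
import math

BONES = {                                   # name: (parent, length)
    "hip":  (None,   0.0),
    "knee": ("hip",  1.0),
    "foot": ("knee", 1.0),
}

def world_position(bones, angles, name):
    """Walk up the parent chain, then accumulate rotations root-first."""
    chain = []
    while name is not None:
        chain.append(name)
        name = bones[name][0]
    x = y = total_angle = 0.0
    for bone in reversed(chain):            # root first
        total_angle += angles.get(bone, 0.0)
        length = bones[bone][1]
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return x, y

# Bending only the hip by 90 degrees moves the foot too - the chain
# stays connected because children inherit the parent's rotation.
angles = {"hip": math.pi / 2}
foot = world_position(BONES, angles, "foot")
```

Because each child only ever stores an angle relative to its parent, there is no way for the foot to detach: its world position is always derived from the knee's, and the knee's from the hip's.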

Yep, I’m doing that too

I’m not using a shader at all, at the moment. I thought about using an “influencing” technique for handling limb movements, but it seems too slow, because it means adjusting all the vertices at every frame, which slowed my iPad 3 from 60 FPS to 8 FPS.




Yeah, that’s the point, if you modify the vertices each frame you are stuffed performance wise.

But you can either bind the “bones” to the vertices and dynamically shift the vertices in the vertex shader (my approach), which with good weightings can smooth shoulder joints etc., or give each body segment its own mesh, translate the mesh position by the bones, and draw each mesh at the right place/rotation. And as you say, big shoulder pads hide all issues :wink: And that way the mesh draws are just normal mesh draws.

Sorry - I’m in a silly mood this afternoon! :slight_smile: