My Grass simulation with vertex shaders

Hi!

For a project I needed to simulate someone walking on a grass field. I tried prototyping it with Codea before 1.5, but it was very slow and only 2D.
With shader support in the new version, it was the perfect project for starting to learn shaders, and I’m loving the result.

Here is my WIP result (it’s a pure vertex shader; I will work on the fragment shader later) on my iPad 2.

Dang but that’s cool. Good work!

I could swear I saw gollum stick his head up once

Nice work

That’s amazing… Nice work.

That is just… wow… any chance of some sharing of the code?

Woah!!! This is awesome!! Well done!

Hi @Bendiben,

Awesome demo, also love the inverted “Made with Codea” in the top left-hand corner of the shader!!!

Thanks

Bri_G

Hi all, thanks for your feedback.
@Spacemonkey: I cannot share the source as it is related to a professional project, but I like the idea of sharing the process (and apart from that, my code is really messy since I taught myself programming and it’s a mix of French and English…).

The shader is a pure vertex shader that relies heavily on attributes; the fragment shader just returns vColor. (I learned about attributes and buffers from the undulate code released by mpilgrem on this forum.)

In the setup function, every blade of grass (composed of 3 triangles) is randomly placed on the floor. At this step, the blades are totally straight.
For the attributes, I add a weight that will be used to calculate the final position.
The bottom vertices have a weight of zero so they don’t move, the top ones have a weight of 1 so they fully move, and the trick is to give the mid point a weight below 0.5 (let’s say 0.4) so the blade automatically bends when a displacement is applied.
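In Codea, that per-vertex weight could be fed to the shader through a custom attribute buffer. A hedged sketch (`m`, `bladeCount` and `bladeHeight` are illustrative names, not the actual code):

```lua
-- Hypothetical sketch: per-vertex weight as a custom attribute.
-- Each blade here is 3 triangles = 9 vertices.
local weights = m:buffer("weight")   -- "attribute float weight" in the shader
for i = 1, bladeCount * 9 do
    local v = m:vertex(i)
    if v.y < 0.01 then
        weights[i] = 0.0             -- base vertices never move
    elseif v.y < bladeHeight * 0.9 then
        weights[i] = 0.4             -- mid point: below 0.5 so the blade bends
    else
        weights[i] = 1.0             -- tip moves fully
    end
end
```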

In the shader, the displacement is the sum of three “forces”:
*The noise, to shuffle the blades a little bit
*The wind
*And the “foot step”

All forces are x/y only. The z is calculated later.

For the noise:
This part relies on textures.
As the vertex shader cannot access textures (only the fragment shader can), I had to set another attribute (a vec2) in the Lua code and read an image (made with Photoshop) that corresponds to the x/y displacement. Actually I set 4 different attributes that I can interpolate in the shader to make this noise move.
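A sketch of that Lua-side baking step (the asset name, attribute name and the 10x10 field mapping are assumptions):

```lua
-- Hypothetical sketch: bake a Photoshop displacement image into a
-- vec2 attribute, since the vertex shader cannot sample textures.
local img = readImage("Documents:noise")    -- assumed asset name
local noiseBuf = m:buffer("noiseDisp")      -- "attribute vec2 noiseDisp"
for i = 1, m.size do
    local v = m:vertex(i)
    -- map the blade's ground position (here a 10x10 field) to a pixel
    local px = math.floor((v.x + 5) / 10 * (img.width - 1)) + 1
    local py = math.floor((v.z + 5) / 10 * (img.height - 1)) + 1
    local r, g = img:get(px, py)
    -- remap the 0..255 red/green channels to a -1..1 x/y displacement
    noiseBuf[i] = vec2(r / 255 * 2 - 1, g / 255 * 2 - 1)
end
```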

For the wind:
It’s set purely in the shader; it’s a pure math formula built with the sin function. I wanted to make the blades bounce, so a sine is perfect. You have to attenuate over distance, and the frequency also has to increase with distance. So the function is something like (1/(dist*dist)) * sin(dist*(1+dist)).
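As a GLSL sketch (the time term and names are my assumptions; the falloff and frequency shape are as described):

```glsl
// Sketch of the wind term: amplitude attenuates with squared distance,
// frequency grows with distance, a time uniform animates the bounce.
// Like the formula above, this blows up near dist = 0, so clamp in practice.
uniform highp float time;

float wind(float dist)
{
    return (1.0 / (dist * dist)) * sin(dist * (1.0 + dist) + time);
}
```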

For the “foot step”:
It’s also pure math:
I calculate the distance from a center point (a uniform vec2).
If distance < radius (a uniform float), then the force is in the direction of the vector from the center, and the amplitude is greatest at the center and zero at the radius:
the amplitude is proportional to 1 - p*p (with p = distance/radius).
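A GLSL sketch of that force (uniform and function names are illustrative):

```glsl
// Sketch of the foot-step force in ground space.
uniform highp vec2 footCenter;   // touch position in ground space
uniform highp float footRadius;

vec2 footForce(vec2 pos)
{
    vec2 d = pos - footCenter;
    float dist = length(d);
    if (dist >= footRadius || dist == 0.0) return vec2(0.0);
    float p = dist / footRadius;          // 0 at the center, 1 at the edge
    return normalize(d) * (1.0 - p * p);  // push outward, zero at the radius
}
```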

The last step is to sum the forces, clamp the sum if necessary (if it’s too big), and calculate the new z of the vertex according to length conservation (Pythagoras).
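Putting the pieces together, that last step might look roughly like this in GLSL (a sketch; `len` is assumed to be the vertex’s resting height above the blade base, and the weight attribute is the one described earlier):

```glsl
// Sketch of the final step: sum the x/y forces, clamp, then recover z
// so the blade bends without stretching (length conservation).
attribute float weight;   // 0 at the base, ~0.4 mid, 1 at the tip

vec3 displaced(vec3 rest, vec2 noiseF, vec2 windF, vec2 footF, float len)
{
    vec2 d = (noiseF + windF + footF) * weight;
    float m = length(d);
    if (m > len) d *= len / m;            // clamp if the sum is too big
    // Pythagoras: the vertical part shrinks as the offset grows
    float z = sqrt(max(len * len - dot(d, d), 0.0));
    return vec3(rest.x + d.x, rest.y + d.y, z);
}
```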

For the color, I set a darker color at the bottom and a lighter one at the top. The ground is a quad with a texture. As I know where the blades are, I put a little circle at the base of each blade to simulate shadow/ambient occlusion.

That’s the global process.

I have a lot of improvement in mind, I will post further steps.

There is something that you cannot see in the video: the “foot step” is not located under my finger. I take CurrentTouch x/y (screen space) and put it into ground space (uv coords).
As there is perspective and camera rotation, they are never aligned.
Does anyone know how to do true 3D picking?

happy coding!

Great description of the process. Thanks!

.@Bendiben that’s brilliant, thanks. Almost better than code; now I have a puzzle to solve.

As to 3D picking, this is a problem I’ve been thinking about and not yet solved… One idea I’ve had is to render the scene to an image, but color that render based on the object or location; then I should be able to do a color lookup based on the touch, which should tell me the object I picked. There are 2 problems with this: first, you can’t render the same mesh with 2 shaders, so you have to duplicate the meshes; also, renders to image don’t obey Z order.

However, in your project’s context you don’t need to double-render the grass: you could render just the square upon which the grass sits, with the color of each point in that render representing its ground coordinates. Then it should just be a matter of getting the pixel color under the touch, and you know which point on the ground you touched…
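A hedged Codea sketch of that idea (the coordinate-as-color shader and all names are assumptions):

```lua
-- Hypothetical sketch: render the ground quad offscreen with a shader
-- that writes ground u/v as red/green, then read the pixel under the touch.
local pickImg = image(WIDTH, HEIGHT)

function groundPointAt(touch)
    setContext(pickImg)
    background(0, 0, 0, 255)
    -- reproduce exactly the visible pass's camera
    perspective(45)
    camera(0, H, 5, 0, 0, 0)
    rotate(R, 0, 1, 0)
    pickQuad:draw()            -- ground mesh with the coord-as-color shader
    setContext()
    local r, g = pickImg:get(math.max(1, math.floor(touch.x)),
                             math.max(1, math.floor(touch.y)))
    return vec2(r / 255, g / 255)   -- decode back to ground u/v in 0..1
end
```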

This is my first attempt at it, using an article I found as inspiration as well. Really ugly grass texture and no lighting yet, but things move at least.

http://http.developer.nvidia.com/GPUGems/gpugems_ch07.html

``````
--# Main
-- Grass

function setup()
    displayMode(FULLSCREEN)
    R, H = 40, 1
    base = mesh()
    base.vertices = {
        vec3(-1,0,-1), vec3(1,0,-1),
        vec3(1,0,1), vec3(1,0,1),
        vec3(-1,0,1), vec3(-1,0,-1)
    }
    base:setColors(color(46, 106, 61, 255))
    g = Grass()
end

function draw()
    background(0, 0, 0, 255)
    camera(0, H, 5, 0, 0, 0)
    perspective(45)
    rotate(R, 0, 1, 0)
    g:draw()
    scale(4)
    base:draw()
end

function touched(touch)
    if touch.state == MOVING then
        R = R + touch.deltaX * .5
        H = math.max(H - touch.deltaY * .02, .5)
    end
end

--# Grass
Grass = class()

-- multiply a vec3 by the rotation part of a 4x4 matrix
function mult(m, v)
    return vec3(
        m[1]*v.x + m[2]*v.y + m[3]*v.z,
        m[5]*v.x + m[6]*v.y + m[7]*v.z,
        m[9]*v.x + m[10]*v.y + m[11]*v.z
    )
end

function Grass:init(x)
    self.m = mesh()
    self.m.shader = shader(grassVertex, grassFragment)
    http.request("https://dl.dropbox.com/s/2kboo35minb8opx/Photo%202013-03-19%2010%2011%2028.jpg?token_hash=AAFsT-JmLRpPP5F0pet5tZHo2c0-aaOu7LpwzHCWXVPS_g&dl=1", function(data)
        self.m.texture = data
    end)

    local sq = {
        vec3(0,0,0), vec3(1,0,0),
        vec3(1,1,0), vec3(1,1,0),
        vec3(0,1,0), vec3(0,0,0)
    }
    local vs, uvs = {}, {}
    for x = -5, 5 do
        for z = -5, 5 do
            -- random offset in [-0.5, 0.5] (math.random takes integer arguments)
            local r = vec3(math.random() - .5, 0, math.random() - .5)
            -- three quads rotated 60 degrees apart form one grass "star"
            for j = 0, 2 do
                local m = matrix():rotate(60*j, 0, 1, 0)
                for i, v in ipairs(sq) do
                    local p = v - vec3(.5, 0, 0)
                    p = mult(m, p) + vec3(x, 0, z)*.7 + r
                    table.insert(vs, p)
                    table.insert(uvs, v)
                end
            end
        end
    end
    self.m.vertices = vs
    self.m.texCoords = uvs
    self.m:setColors(255, 255, 255, 255)
end

function Grass:draw()
    self.m.shader.time = ElapsedTime  -- drive the animation
    self.m:draw()
end

grassVertex = [[
uniform mat4 modelViewProjection;
attribute vec4 position;
attribute vec4 color;
attribute vec2 texCoord;

varying lowp vec4 vColor;
varying highp vec2 vTexCoord;
uniform highp float time;

// 2D value noise
float hash( float n ) {
    return fract(sin(n)*43758.5453);
}
float noise( in vec2 x ) {
    vec2 p = floor(x);
    vec2 f = fract(x);
    f = f*f*(3.0-2.0*f);
    float n = p.x + p.y*57.0;
    return mix(mix(hash(n+ 0.0), hash(n+ 1.0), f.x),
               mix(hash(n+57.0), hash(n+58.0), f.x), f.y);
}
// fractal brownian motion (middle octaves disabled for speed)
float fbm( vec2 p ) {
    float f = 0.0;
    f += 0.50000*noise(p); p = p*2.02;
    //f += 0.25000*noise(p); p = p*2.03;
    //f += 0.12500*noise(p); p = p*2.01;
    //f += 0.06250*noise(p); p = p*2.04;
    f += 0.03125*noise(p);
    return f/0.984375;
}

void main()
{
    vColor = color;
    vTexCoord = texCoord;
    vec4 p = modelViewProjection * position;
    // only displace the top edge of each blade quad
    if (vTexCoord.y > .9) {
        float n = fbm(p.xy*time*.2)*4. - 2.;
        p = p + vec4(n, 0., 0., 0.);
    }
    gl_Position = p;
}
]]

grassFragment = [[
uniform lowp sampler2D texture;
varying lowp vec4 vColor;
varying highp vec2 vTexCoord;

void main()
{
    gl_FragColor = texture2D( texture, vTexCoord );
}
]]
``````

.@tnlogy interesting, I found that article as well. It’s quite a different style of grass simulation, more the kind of graphical grass in a wider environment than simulating the actual grass as @Bendiben did. I think something you could do is play with randomising the positions of the grass a bit; it comes out very regular.

Yes, especially when you watch it from a height. You can change the range of the random offset r in the loop to get it more distributed. Putting the blades closer together and adding better lighting would also make it more realistic. I also think my noise function is a bit weird; I should be able to modify it so that it looks more like wind, not just random movements.

.@Bendiben for noise I have an snoise function (off the web) in http://twolivesleft.com/Codea/Talk/discussion/2348/shader-old-film-effect#Item_1 and a quick Google finds a bunch of in-shader noise generation snippets; they rely on just passing in a uniform as a seed to make the noise animate.

This looks good for some options and samples too http://stackoverflow.com/questions/4200224/random-noise-functions-for-glsl

I started working on my own implementation this lunchtime; this is fun stuff. I’ve got it rendering a field of grass, 10000 blades at 60fps. It’s got deformation working; now I just need to implement the actual causes of deformation, i.e. wind, touch and noise.

Mine is uglier than yours, but that’s potentially down to scale, color and the lack of noise.

Nice, would be fun to see your code :). I think the noise function I snatched from the web is quite OK, but I think I use it incorrectly. Here is an older video of the cloud generation that I originally used it for. The one I use is a bit slow, but can be “optimized” by making the pattern less dense, with fewer calls in the fbm function.

http://youtu.be/IqRNT23v8iY?t=33s

It’s so easy to get distracted by other people’s code and videos out of curiosity.

Btw, the world is small: Stefan Gustavson, who wrote the paper about simplex noise, is at the university about 200 meters from my home here in Norrköping.

https://github.com/ashima/webgl-noise

OK, here’s a first cut, as I won’t be touching this for a few days… The touch is totally wrong, but at least your finger can cause some pressure. It’s ugly. But it has noise… Interestingly, I think the YouTube video is less ugly than the reality…

Next steps:
- extend the pressure mechanism to multitouch (and I was thinking of a similar thing for wind, as well as circles…)
- lighting: I started trying to think this through, but surface normals can’t be precalculated like I do elsewhere because the grass gets deformed. I guess I need to apply a similar deformation to the normals, but right this moment it’s beyond me.
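One speculative way to sketch that normal deformation in the vertex shader (everything here is an assumption, untested; it just tilts the resting normal by the same weighted horizontal offset that bends the blade):

```glsl
// Speculative sketch: bend the normal with the blade's displacement.
attribute vec3 normal;
attribute float weight;

vec3 bentNormal(vec3 n, vec2 disp, float bladeLen)
{
    // tilt proportionally to the horizontal offset, then renormalize
    vec3 tilt = vec3(disp.x, disp.y, 0.0) * (weight / bladeLen);
    return normalize(n + tilt);
}
```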

Code here:
https://gist.github.com/sp4cemonkey/5208579 (some people have had issues with HTML encoding from gists; if in doubt, try the raw view etc…)

@Spacemonkey Nice!
I get about the same performance as you; the bottleneck seems to be the number of polys, not the shader.
I agree with tnlogy, there is a lot of improvement to do!
Some things are easy (density map, color map, height map), some need more time (improving shadows, a fragment shader with texture and lighting, fake normal calculation according to the bending…) and some seem really tricky to me: making the physics more realistic, which means bounces and trails. As we cannot loop the output of the vertex shader back into it (we can, but it’s very slow), it seems impossible. I’m focusing on this part, finding the magic trick. It should be OK for the bounces, but keeping a trail is harder (Codea could feed the shader with a uniform array of points/radii, but this might be very slow…)

It’s really sad that texture lookup is not allowed in vertex shaders…

I will post another vertex shader soon (not related to grass); it still needs some adjustments.

Thanks for the code, interesting! It seems to work nicely to place the grass randomly instead of looping in a grid like me. Is the buffer object resized automatically? I’ve done that manually before.

Yes, as long as the attribute is bound before you resize the mesh, the attributes are sized appropriately as well. The other nice thing is that once I’ve got the buffer bound to a variable, I don’t have to rebind it every time I resize.
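In other words (a minimal sketch with illustrative names):

```lua
-- Bind the custom attribute once, before resizing the mesh.
local pressure = m:buffer("pressure")
m.vertices = newVertices   -- attribute buffers grow/shrink with the mesh
pressure[1] = 0            -- the same handle is still valid, no rebind needed
```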

Also, a nice thing to play with in the vertex shader is the last line of the snoise function:

``````
return 30.0 * dot(m, g);
``````

Adjusting the 30.0 to a bigger number makes the noise more visible. I apply the noise twice, once in the x direction and once in the z direction, so things wobble both ways, not just back and forth, which is nice.
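Roughly like this (a sketch, not the actual code; the snoise signature, scale factors and weight are assumptions, and the 30.0-style gain lives inside snoise as discussed):

```glsl
// Sketch: sample the noise separately for x and z so blades wobble
// both ways instead of just back and forth.
float nx = snoise(vec2(p.x * 0.4, time));
float nz = snoise(vec2(p.z * 0.4, time + 100.0)); // offset so x and z differ
p.x += nx * weight;
p.z += nz * weight;
```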