How to pass arrays of points to Frag shader

Hello clever people,

My top-down game features a number of obstacles, which take the form of a 2D irregular polygon (with probably 5-20 points). I can create a shader to draw this polygon solid with no problem, but I would like to create a system which allows me to draw them in a more interesting way. A simple example might be to draw a thick outline, or have a gradient from the center out, or create a soft bezier line for the points. All of this requires the fragment shader to have knowledge of all of the outer points. How can I pass the outer points to my frag shader? For example, this shader does what I'm trying to do: https://www.shadertoy.com/view/4ts3DB except that the points are hard-coded.

The frustrating thing is that I can almost pass an array. Am I fundamentally misunderstanding something here?

Here is some simple code which the inline comments explain:

function setup()
    m = mesh()
    outerPoints = {
        vec2(0,20),
        vec2(1,300),
        vec2(170,350),
        vec2(300,265),
        vec2(200,40)
    }
    m:addRect(WIDTH/2, HEIGHT/2, WIDTH, HEIGHT)
    m.vertices = triangulate(outerPoints)
    m:setColors(255,255,255)
    m.shader = shader(MyShader.vsh,MyShader.fsh)
    
    -- The shader is set up correctly, but uncommenting the following line to set the array uniform causes an error ("Table has mismatching length for vec2 array uniform")
    
    -- m.shader.pts = outerPoints
end

function draw()
    background(40, 40, 50)
    m:draw()
end

MyShader = { 
vsh = [[
uniform mat4 modelViewProjection;
attribute vec4 position;

void main()
{
    gl_Position = modelViewProjection * position;
}
]], 
fsh = [[
uniform highp vec2 pts[4];

void main(void) {
    highp vec2 point1 = pts[0];
    highp vec2 point2 = pts[1];

    // Crude debug: Use the point data to color the triangle
    gl_FragColor = vec4(point1.x, point2.x, 0.0, 1.0);
}
]]}

Thanks,

Josh

This is a strange error: the GLSL compiler shrinks a uniform array down to the number of elements actually referenced in the shader. So if you declare an array of length 4 but only refer to 2 of those elements, the effective array length gets resized, and Codea's table-length check then fails. The other point to note is that although GLSL arrays are indexed from 0, the declared length is a count of elements, not the highest index, so your array is also too short for your five points (i.e. the last position in an array of length 5 is 4) 8-}

This version of your code runs without the “length mismatch” error:

-- Shader Error
function setup()
    m = mesh()
    outerPoints = {
        vec2(0,20),
        vec2(1,300),
        vec2(170,350),
        vec2(300,265),
        vec2(200,40)
    }
    m:addRect(WIDTH/2, HEIGHT/2, WIDTH, HEIGHT)
    m.vertices = triangulate(outerPoints)
    m:setColors(255,255,255)
    m.shader = shader(MyShader.vsh,MyShader.fsh)

    -- With the array declared at length 5 and every element referenced,
    -- setting the uniform now works without the mismatch error:

    m.shader.pts = outerPoints
end

function draw()
    background(40, 40, 50)
    m:draw()
end

MyShader = { 
vsh = [[
uniform mat4 modelViewProjection;
attribute vec4 position;

void main()
{
    gl_Position = modelViewProjection * position;
}
]], 
fsh = [[
uniform highp vec2 pts[5];

void main(void) {
    highp vec2 point0 = pts[0];
    highp vec2 point1 = pts[1];
    highp vec2 point2 = pts[2];
    highp vec2 point3 = pts[3];
    highp vec2 point4 = pts[4];
    // Crude debug: Use the point data to color the triangle
    gl_FragColor = vec4(point1.x, point2.x, 0.0, 1.0);
}
]]}


You, Sir, are a legend. Thank you.

Now for bonus points: is it somehow possible to pass an array of unknown size? i.e. a different size each time?

Cheers!

No, you can’t have arrays of unknown/variable size in GLSL ES. You could use a for loop in the frag shader over a fixed maximum size, and pass an integer uniform telling the loop where to stop. I think that would avoid the length-mismatch error.
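A sketch of that workaround (my own untested example, assuming a maximum of 20 points and a hypothetical `numPts` integer uniform you'd set from Lua alongside `pts`). Note that GLSL ES requires loop bounds to be compile-time constants, so the loop runs to the maximum and breaks out early:

```glsl
// Fragment shader sketch: fixed-size array, variable logical length.
const int MAX_PTS = 20;
uniform highp vec2 pts[MAX_PTS];  // always pass a table of length 20 from Lua
uniform int numPts;               // how many entries are actually meaningful

void main(void) {
    highp float minDist = 1e6;
    for (int i = 0; i < MAX_PTS; i++) {
        if (i >= numPts) break;   // stop at the real point count
        minDist = min(minDist, distance(gl_FragCoord.xy, pts[i]));
    }
    // Crude debug: shade by distance to the nearest point
    gl_FragColor = vec4(vec3(1.0 - minDist / 200.0), 1.0);
}
```

Because the loop touches every element up to `MAX_PTS`, the compiler shouldn't shrink the array, but on the Lua side you'd still need to pad `outerPoints` out to 20 entries so the table length matches.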

A simple example might be to draw a thick outline, or have a gradient from the center out, or create a soft bezier line for the points. All of this requires the fragment shader to have a knowledge of all of the outer points.

Only the last of these three, the bezier, requires the frag shader to know all the points (actually, even for the bezier it only needs to know 3 or 4 of the points, depending on which algorithm you use for the curve).

For outlines, the frag shader only needs to know whether each point is on the outer edge or not. This is how I draw the stroke on the rounded rectangles in Soda. I used the normal attribute to define whether or not a given point was on the edge of the shape (as this is otherwise unused in 2D drawing, and using a built-in attribute is slightly easier than a custom buffer).

For this approach to work, each edge vertex needs a corresponding internal vertex. i.e. if your shape is rectangular, it’d need to be made up of 4 triangles that meet in the middle (rather than the standard 2-triangle quad). The edge can only be defined in relation to the non-edge (or, there is no outside without inside. The zen of coding).
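A minimal sketch of that idea (not Soda's actual code; the convention that `normal.x` is 1.0 on the edge and 0.0 at the centre, and the 0.85 threshold, are my assumptions):

```glsl
// --- Vertex shader: smuggle an "edgeness" flag in the otherwise-unused normal ---
uniform mat4 modelViewProjection;
attribute vec4 position;
attribute vec3 normal;          // normal.x = 1.0 for outer-edge vertices, 0.0 for the centre
varying highp float vEdge;

void main() {
    vEdge = normal.x;
    gl_Position = modelViewProjection * position;
}

// --- Fragment shader: vEdge interpolates from 0 at the centre to 1 at the edge ---
varying highp float vEdge;

void main(void) {
    if (vEdge > 0.85) {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);   // stroke
    } else {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);   // fill
    }
}
```

The threshold controls the stroke thickness relative to each triangle; a `smoothstep` instead of the hard `if` would give an anti-aliased stroke.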

A gradient is even easier, the frag shader just needs to know where the centre is. Have a look at my multi-step colour gradient shader: http://codea.io/talk/discussion/6772/multiple-step-colour-gradient-shader
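For illustration, here's a single-step radial version (not the multi-step shader from that link; `center` and `radius` are uniforms you'd set from Lua, e.g. `m.shader.center = vec2(WIDTH/2, HEIGHT/2)`):

```glsl
// Fragment shader sketch: radial gradient out from a centre point.
uniform highp vec2 center;
uniform highp float radius;

void main(void) {
    // 0.0 at the centre, 1.0 at (or beyond) the radius
    highp float t = clamp(distance(gl_FragCoord.xy, center) / radius, 0.0, 1.0);
    gl_FragColor = mix(vec4(1.0, 0.6, 0.2, 1.0),   // inner colour
                       vec4(0.2, 0.1, 0.4, 1.0),   // outer colour
                       t);
}
```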

Top answers =D>

And beziers are easy with a vertex shader …
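The idea, roughly, is to make the vertex shader do the curve evaluation. This is my own sketch, not the shader from LoopSpace's gist: the mesh supplies the curve parameter t in `position.x` (e.g. 150 vertices with t = i/149), and the four control points of a cubic bezier are uniforms:

```glsl
// Vertex shader sketch: place each vertex on a cubic bezier.
uniform mat4 modelViewProjection;
uniform highp vec2 p0, p1, p2, p3;   // control points, set from Lua
attribute vec4 position;             // position.x carries t in [0,1]

void main() {
    highp float t = position.x;
    highp float s = 1.0 - t;
    // Cubic Bernstein form: s^3*p0 + 3s^2t*p1 + 3st^2*p2 + t^3*p3
    highp vec2 p = s*s*s*p0 + 3.0*s*s*t*p1 + 3.0*s*t*t*p2 + t*t*t*p3;
    gl_Position = modelViewProjection * vec4(p, 0.0, 1.0);
}
```

Since only the uniforms change per curve, the same mesh can be redrawn for each bezier.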

I guess if you were doing a curve on the fragment shader it would need to be an algorithm that stays inside the outline of the shape (ie some of the shape’s points would be control points which the line wouldn’t go through). @LoopSpace with a curve on the vertex shader, you’d implement that with lots of extra vertices (like the corners on your rounded rectangle?)

@yojimbo2000 See https://gist.github.com/loopspace/5611538#file-bezier-lua for a bezier shader. With the default of 150 steps, you can put a lot of beziers on the screen before you see any sort of a slow down.

@LoopSpace thanks for that!

This post on line drawing techniques in Open GL is really interesting:
http://mattdesl.svbtle.com/drawing-lines-is-hard