I have continued hacking on the AR Face demo. I am trying to draw a point cloud on the face by walking the positions of the AR face model. I want it to track the face just like the textures in the demo do. Those textures stick just like the facehuggers in the Alien movies.
I can get frustratingly close. The triangle mesh I use wraps around nicely. My problem is that I can't seem to set up the camera correctly, so my point cloud drifts all around while staying "kinda" close. The rotation does look good. I don't see how to make my viewing setup jibe with what is happening down in the Craft scene the demo uses. I am hoping there is an API that lets me get the setup values I need.
I am certain I am just suffering the confusion that comes with being a new user, and I am hoping someone in the community can guide me.
Below is the (hopefully) revealing subset of my code. Also attached is a picture of how close I can get when I position my iPad by hand just right.
function draw()
    scene:update(DeltaTime)
    scene:draw(WIDTH, HEIGHT)

    pushMatrix()
    setupCamera()
    drawFacePointCloud(mainEntity, triMesh)
    popMatrix()

    drawTarget(mainAnchor)
end
function setupCamera()
    local cz = 0.15  -- hand-tuned camera distance along z
    perspective()
    camera(0, 0, cz, 0, 0, 0, 0, 1, 0)
end
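
To show the direction I have been poking at: the sketch below drives camera() from the scene's own camera instead of my hard-coded offset. I am only assuming here that scene.camera is the AR-driven camera entity and that it exposes position/forward/up the way other Craft entities do — I haven't confirmed any of that, which is really my question.

-- Sketch only, not working code. Assumes scene.camera is the AR camera
-- entity and exposes position/forward/up like other Craft entities.
function setupCameraFromScene()
    local cam = scene.camera
    local p, f, u = cam.position, cam.forward, cam.up
    perspective()  -- presumably this also needs the AR camera's field of view
    camera(p.x, p.y, p.z,
           p.x + f.x, p.y + f.y, p.z + f.z,
           u.x, u.y, u.z)
end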
function drawFacePointCloud(face, mesh)
    if face == nil then return end

    local wp = face.worldPosition
    local wr = face.eulerAngles
    local positions = face.model.positions

    pushMatrix()
    -- move into the face's frame: translate, then rotate about each axis
    translate(wp.x, wp.y, wp.z)
    rotate(wr.x, 1, 0, 0)
    rotate(wr.y, 0, 1, 0)
    rotate(wr.z, 0, 0, 1)

    mesh:setColors(color(255, 255, 0, 128))
    -- stamp one small mesh at every vertex position of the face model
    for _, p in pairs(positions) do
        pushMatrix()
        translate(p.x, p.y, p.z)
        mesh:draw()
        popMatrix()
    end
    popMatrix()
end
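
The other direction I considered (again just a sketch, assuming scene:entity() and the parent property work the way the Craft docs describe) is to skip my manual translate/rotate entirely and parent an entity to the face, so Craft applies the same world transform the demo textures get:

-- Sketch: let Craft do the transform by parenting to the tracked face.
-- Assumes scene:entity() and entity.parent per the Codea Craft docs.
function makeCloudEntity(face)
    local e = scene:entity()
    e.parent = face            -- inherits the face's world transform
    e.position = vec3(0, 0, 0) -- local offset relative to the face
    return e
end

If parenting is the intended way to get my points into the same frame as the face, I'd happily drop the matrix juggling above.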