hacking the AR Face demo

https://youtu.be/tEnuHaiTgrs

I hacked the AR Face demo to use my face as an input device: the anchor.lookAtPoint and the eyeWide blendShapes move a circle around the display, and the color changes when I open my eyes wide enough.

-- scale the look-at point into screen space (eyeAdjust is a hand-tuned factor)
local lp = anchor.lookAtPoint * eyeAdjust
lookpt = vec3(lp.x * -900000.0, lp.y * -900000.0, lp.z)
-- how wide the left eye is open
ewide = anchor.blendShapes.eyeWide_L
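For context, a minimal draw routine built around those two values might look something like this. This is only a sketch of the idea, not the actual demo code: the 0.5 threshold, the screen-centering, and the colors are my own guesses, and it assumes Codea's standard drawing globals (WIDTH, HEIGHT, fill, ellipse).

```lua
-- Hypothetical sketch: move a circle with the gaze point and change
-- its color when the left eye opens wide. lookpt and ewide are the
-- values computed above; everything else is illustrative.
function drawCursor()
    local x = WIDTH/2 + lookpt.x
    local y = HEIGHT/2 + lookpt.y
    if ewide > 0.5 then      -- hand-tuned "eyes wide" threshold
        fill(255, 0, 0)      -- wide eyes: red
    else
        fill(255, 255, 0)    -- normal: yellow
    end
    ellipse(x, y, 40)
end
```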

It is very very very crude but heck, only 5 minutes of hack time.

I think this will be an interesting frontier. You can imagine AR apps in the future where you are looking through the magic iPad window and there are entities in the room that come alive when you attend to them. They may even be sent your emotional-state information from some model based on reading the face.

see:

https://www.ted.com/talks/rana_el_kaliouby_this_app_knows_how_you_feel_from_the_look_on_your_face?language=en

Hah that’s really impressive!

I wonder about more subtle interactions using eye tracking. Stuff like not letting the display go to sleep while you're looking at it, or pre-caching a link on a webpage because you looked at it, so it's already loaded when you click.

Instead of bricks or the other materials in the AR Face demo, I would like to show yellow dots or a wireframe on my face that match the positions in the face model, but I can't figure out how to make a Material or Shader in Shade that would do that.

Attached is a picture that shows the dots on my forehead. Any ideas are welcome.

@DavidLeibs Check out the wireframe shader that @John recently added as a Shade example.

I had tried that. By itself it gives the wrong result. See picture.

I don't know how to tweak it either. It seems like material.color isn't there, since it isn't like other Materials.

@iam3o5am @John
I got it much closer by finding a place to add an entity.model:split()
Now I just need to figure out how to make the color clear so my face will show through while leaving black wires.

@DavidLeibs be careful with model:split() as face geometry assumes the same vertex indices every frame, which will be altered by that method and may not work how you think it should.

@John, you are right, I am having to do the model:split() on each call to Face:updateWithAnchor. This did make me feel very unclean, since I don't have a clue what is happening behind all the gizmology. It appears to work, in that the Wireframe shader looks right and appears to track my facial gyrations. I suppose I should just back out of using the entity to do what I want and try to use lower-level GL.
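For the record, the workaround amounts to re-splitting the model on every anchor update. Only updateWithAnchor and model:split() come from this thread; the function shape and the applyAnchor step are hypothetical placeholders for whatever the demo actually does with the anchor.

```lua
-- Hypothetical sketch of the per-frame workaround: after each face
-- anchor update, split the model so the wireframe shader sees
-- independent triangles. As noted above, this fights the assumption
-- that vertex indices stay the same from frame to frame.
function Face:updateWithAnchor(anchor)
    self:applyAnchor(anchor)     -- placeholder for the demo's own anchor handling
    self.entity.model:split()    -- re-split every frame so the wireframe renders
end
```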

I do really like working with retained scene graphs, so it would probably be a nice feature if you could configure select entities to draw as a wireframe or point cloud.

If it works it works. I’ll check again to see if I made a faulty assumption about it :blush: