How do I reference the current AR scene (camera image) as a texture, for example from a mesh?

The usual mesh.texture = CAMERA does not do the right thing; it appears to suspend or stop the AR session.
Is there maybe an undocumented property on one of the involved objects (scene.ar, maybe) that I'm not aware of?

Additionally, how would I get the dimensions of that texture (in order to rescale projected world coordinates to texture u,v)?
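
For concreteness, this is roughly what I'm trying (a minimal repro; the texture assignment is where the session seems to stop):

```lua
function setup()
    scene = craft.scene()
    scene.ar:run()                 -- start the AR session

    m = mesh()
    m:addRect(WIDTH/2, HEIGHT/2, 300, 300)
    m.texture = CAMERA             -- this appears to suspend/stop the session
end

function draw()
    scene:update(DeltaTime)
    scene:draw()
    m:draw()
end
```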

Best,
— Andreas

@andi Just looking into this; there are always some difficulties getting AR working with the standard camera API

Thanks for looking into it! John must’ve encountered and solved this issue at some point:
https://mobile.twitter.com/johntwolives/status/1065945915628773376?s=12

Also, in ARKit the equivalent object I’d be interested in is part of ARSession.currentFrame

Hi @andi, for that example I created an image the size of the screen and used setContext() to draw the scene to it. I then drew that image to the screen using sprite(), which left me free to use the AR camera image for anything else I wanted while still displaying as normal. If you don't actually need to draw the scene to the screen, you can just use the image by itself. Roughly, the setup looks like the sketch below.
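
From memory (so treat this as an untested sketch; the quad mesh at the end is just to show reusing the capture as a texture):

```lua
function setup()
    scene = craft.scene()
    scene.ar:run()                     -- start the AR session
    capture = image(WIDTH, HEIGHT)     -- screen-sized offscreen buffer

    -- example mesh that reuses the captured scene as its texture
    m = mesh()
    m:addRect(WIDTH/2, HEIGHT/2, 300, 300)
    m.texture = capture
end

function draw()
    scene:update(DeltaTime)

    -- draw the AR scene into the image instead of the screen
    setContext(capture)
    scene:draw()
    setContext()

    -- draw the capture with sprite() so the display still looks normal,
    -- leaving the image free for anything else
    sprite(capture, WIDTH/2, HEIGHT/2, WIDTH, HEIGHT)

    -- capture.width / capture.height give the texture dimensions you
    -- need to rescale projected world coordinates to u,v
    m:draw()
end
```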

Something to keep in mind is that the data provided by ARFrame is actually two textures in YUV color space that must be combined and converted to RGB, which is what scene.ar is doing in the background. So that data is not really useful on its own.
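
For the curious, one standard full-range BT.601 conversion (illustrative only; the exact matrix depends on the pixel format the frame reports) combines the full-resolution luma plane (Y) with the half-resolution chroma plane (Cb, Cr) like so, with all channels normalized to 0..1:

R = Y + 1.402 * (Cr - 0.5)
G = Y - 0.344 * (Cb - 0.5) - 0.714 * (Cr - 0.5)
B = Y + 1.772 * (Cb - 0.5)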

Thanks, John, I’ll play with that approach.