Projecting texture on poly

Hello all, I’ve been playing around with Codea again and I was wondering: can we project a texture (from the view) onto a polygon that is rotated and positioned in 3d space? And then stick it to the poly so that it moves when I move the polygon?

What do you mean by “from the view”? Do you mean part of the current screen?

Sorry, no, I mean along the z/depth axis of the view or camera. I believe “frontal mapping” is the correct term.

So I have a polygon positioned and rotated somewhere in 3d space, and now I want to project an image onto it, as if using a projector, so the image would not be deformed by perspective.

Then I want to stick it to the poly, so that it does deform due to perspective when the camera or the poly changes position or rotation.

You could do that with a shader where the texCoords are derived from the 2d position on screen. EDIT: I read too quickly. What I’m talking about would only achieve the first part, the projection. Making it “stick” would be tricky. You’re after an effect like the third test (the leap of faith) at the end of Indiana Jones and the Last Crusade, I think B-) . Does Blender have a frontal mapping feature? It might be easier to calculate the texCoords there and import them.

Yes, though I am using it for some matte painting (2.5D) stuff, and this would be great for walls stretching into the distance. Love the Indiana Jones example, I think that must have been my first ever encounter with this kind of trick. :slight_smile:

I was hoping Codea would have some handy features to help with the mapping, or some clever combination of 3d translations and deriving uvw offsets (if those are adjustable) from there.

I don’t like to use 3d software because I’m trying to make it as self contained as possible, so that I only have to load in the images and their depths. Ideally I could calculate everything in the app itself.

I think we need more details about what you’d use this for. I guess the steps are:

  • work out where in 2d screen space the 3d points are (quite a few posts on that topic on the forum, also I think CoolCodea has some blog posts on this)
  • derive and set the texCoords from these
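The steps above can be sketched as follows. This is a minimal illustration in plain Python (not Codea/Lua), assuming a row-major 4x4 model–view–projection matrix; the matrix and vertices are made up for demonstration. The idea is to "bake" the texCoords once by projecting each vertex to normalized screen space:

```python
def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix (list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project_to_uv(mvp, vertex):
    """Project a 3d point through a model*view*projection matrix,
    then remap from clip space [-1, 1] to texture space [0, 1]."""
    x, y, z, w = mat_vec(mvp, [vertex[0], vertex[1], vertex[2], 1.0])
    return ((x / w) * 0.5 + 0.5, (y / w) * 0.5 + 0.5)

# Identity "camera" just for demonstration: a point at the origin
# lands in the middle of the texture.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(project_to_uv(identity, (0.0, 0.0, 0.0)))  # (0.5, 0.5)
```

Because the texCoords are computed once and stored on the mesh, they "stick": when the poly or camera later moves, the texture deforms with the surface, which is the second part of what you want.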

I agree with yojimbo2000 - this post of mine might help

If you don’t want any distortion from OpenGL’s texture mapping interpolation, you could try using a shader to calculate the texture coordinate for each fragment using a uniform mat4 model * view * projection matrix from the projector’s perspective (only needs to be calculated and set once). I did something similar in my shadow mapping project, where I calculated the texture coordinate of a fragment on the shadow map from the light’s perspective.
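To illustrate the projector-matrix approach, here's a hedged sketch in plain Python (the real thing would live in a fragment shader, with the matrix passed as a uniform mat4). The perspective matrix below is a standard OpenGL-style one; the field of view and positions are made-up values:

```python
import math

def perspective(fov_deg, aspect, near, far):
    """Row-major OpenGL-style perspective projection matrix."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def projector_uv(proj, world_pos):
    """Transform a world-space fragment position into the projector's
    clip space, then remap to [0, 1] texture coordinates."""
    x, y, z = world_pos
    clip = [sum(proj[r][c] * v for c, v in enumerate((x, y, z, 1.0)))
            for r in range(4)]
    w = clip[3]
    return (clip[0] / w * 0.5 + 0.5, clip[1] / w * 0.5 + 0.5)

proj = perspective(60.0, 1.0, 0.1, 100.0)
# A point straight down the projector's -z axis maps to the centre
# of the projected texture.
print(projector_uv(proj, (0.0, 0.0, -5.0)))  # (0.5, 0.5)
```

The matrix only needs to be computed and set once; every fragment then does the multiply and perspective divide, exactly as in shadow mapping, just sampling a texture instead of a depth map.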