Texture perspective projection

As part of a larger problem I’m trying to solve, I need to project a texture from the camera plane onto each of the geometry faces of an object.

The way I’m currently approaching this is by setting the UV of each vertex in the geometry to its xy position in clip space. This works as intended when the face normal is aligned with the camera’s focal axis, but the projection breaks as soon as they are no longer aligned.

I’m not entirely sure why this happens, and I would greatly appreciate any help.
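In code, the approach is roughly the following (a minimal sketch with assumed names; it uses BufferGeometry rather than the Geometry of the live example below):

```js
import * as THREE from 'three';

// Sketch of the approach (assumed names): `mesh` uses a BufferGeometry with
// `position` and `uv` attributes, `camera` is the rendering PerspectiveCamera.
function projectUvsFromCamera(mesh, camera) {
  const position = mesh.geometry.attributes.position;
  const uv = mesh.geometry.attributes.uv;
  const v = new THREE.Vector3();

  mesh.updateMatrixWorld();
  for (let i = 0; i < position.count; i++) {
    v.fromBufferAttribute(position, i);
    mesh.localToWorld(v); // local -> world space
    v.project(camera);    // world -> NDC; this applies the perspective divide
    uv.setXY(i, v.x * 0.5 + 0.5, v.y * 0.5 + 0.5); // NDC [-1, 1] -> UV [0, 1]
  }
  uv.needsUpdate = true;
}
```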

[Image: perspective_projection]

live example

Notes: I’m aware that the live code provided suffers from a number of problems, namely using Geometry instead of BufferGeometry and unnecessarily re-calculating the vertex projections. I understand those problems exist; the live example is just meant to illustrate the behavior :slight_smile:

I believe the same effect could easily be achieved using the stencil buffer, but I need the recalculated UVs for use in a later stage, so I can’t make use of stencil rendering techniques :frowning:

Are you trying to do “UV mapping”? What I see in your sketch looks like a simple (or maybe skewed) perspective projection. There’s not much to do there except call render() with a THREE.PerspectiveCamera.

There’s not much to do there except call render() with a THREE.PerspectiveCamera

I’m not sure I follow what you mean here.

Are you trying to do “UV mapping”?

Yeah, I’m trying to UV map the visible faces of the object according to their pixel positions on the screen, which means I’m basically mapping the clip-space coordinates of each vertex to UV positions on the texture.

Which, in theory, would show a non-distorted (or non-skewed, as you said) texture on the surface of the object.

But what I’m getting instead is a skewed projection on the object whenever the face normals aren’t aligned with the camera’s direction.

If you make a cube, put a camera in front of it, and render to a texture, you’ve projected two triangles, as I understand your description. What do you expect to happen when you rotate the cube by 45 degrees around some axis?

Maybe I did a bad job explaining what I wanted in the original post.

There are a few ways of thinking about what I want to achieve here. I’m going to try explaining it two ways:

  • Imagine you render a texture to the full viewport; afterwards you render your scene to the screen again, but only color the pixels where your object is not visible.
    What you end up with is your regular scene, except that where your object would have been drawn you see its silhouette, and inside it the texture rendered previously.

  • A second way of looking at it: imagine you have two layers in any image-editing software. The bottom layer contains your texture; on the top layer you have your rendered scene. Now select the pixels of the top layer that contain the object and delete them. You end up with the same result.

So it doesn’t matter whether the object is rotated, or whether it’s a sphere or a plane: the silhouette of the object will be filled with the texture. But the key point is that I need to do this through UV mapping, not other approaches.

Hopefully that clarifies.

Have you seen this PR? Is it doing something similar to what you want, using a light’s perspective rather than the camera’s? It’s doing the work in the shader, but that’s probably the best approach for implementing new texture mapping functions.

Light projectors

BTW, is this for the LWOLoader?

BTW, is this for the LWOLoader?

No, not really. It’s part of the texture painting project that I’ve been re-thinking. I wanted to move away from raycasting, and this idea just popped into my head. It’s all working great, except that I can’t seem to get this projection right :frowning:

Have you seen this PR?

No, I haven’t; this looks promising :slight_smile:

Maybe this will be helpful too: Texture Projection

I’m not quite sure I understand what you mean. Do you mean like this? https://jsfiddle.net/j4hbq2ek/

This can easily be solved with shader injection. The shader needs to be provided the projection matrix along with the view matrix of some perspective camera; the projected value is written to a varying, and that varying is then used to read a texture (be it some built-in one or another). Unfortunately, the ways to do this are very verbose :frowning:

But yeah, my understanding is that, unlike a projector, the OP wants to actually just render a texture in screen space onto whatever geometry is rendered. Imagine if you could look up the AO map directly, instead of compositing it in an additional step.

This could also be achieved with a suuuuper simple shader material, but I’m on my phone right now.
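Roughly something like this (a sketch; `texture` is an assumed, already-loaded THREE.Texture, while projectionMatrix, modelViewMatrix, and position are provided to every ShaderMaterial by three.js):

```js
import * as THREE from 'three';

// Sketch of such a material: the vertex shader forwards the clip-space
// position in a varying; the fragment shader does the perspective divide
// per fragment before sampling, so the texture lands in screen space.
const material = new THREE.ShaderMaterial({
  uniforms: { map: { value: texture } },
  vertexShader: `
    varying vec4 vClipPos;
    void main() {
      vClipPos = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
      gl_Position = vClipPos;
    }
  `,
  fragmentShader: `
    uniform sampler2D map;
    varying vec4 vClipPos;
    void main() {
      // perspective divide per fragment, then NDC [-1, 1] -> UV [0, 1]
      vec2 screenUv = (vClipPos.xy / vClipPos.w) * 0.5 + 0.5;
      gl_FragColor = texture2D(map, screenUv);
    }
  `
});
```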

The problem (I think) is “how to show a texture on an object using its screen coordinates and not its UVs”.

The fiddle I linked does just that. I’m not sure, though, if this is exactly what he wants.

Yeah, it didn’t load on my phone at first. Exactly that.

I’m still not completely sure I understand what you want here.
Does @Fyrestar’s fiddle solve your problem?

If not, do you mean Planar UV Mapping?

[Image: GUID-5D699E6C-4B3F-4136-9C5C-C696DB4D6F98]

@prisoner849 Thank you. That looks really similar to what I’m looking for.

@Fyrestar The end result is exactly what I’m looking for, but I wanted to achieve it by modifying the vertex UV attributes we send to the renderer, not by modifying the shader mapping itself. Theoretically I’m doing the same thing, but in reality I’m not, which is what creates the skewed mapping in my live example.

@looeee Maybe I’m not even sure myself :rofl:. The difference is that planar UV mapping projects orthogonally onto the mesh, whereas in my case I want to project perspectively, if that makes sense…
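For concreteness, the difference I mean (a sketch with hypothetical names; `v` stands for a vertex position in camera space):

```js
// Hypothetical illustration: `v` is a vertex position in camera space.

// Planar (orthogonal) mapping: depth is simply dropped.
const planarUv = new THREE.Vector2(v.x, v.y);

// Perspective mapping: transform to clip space and divide by w,
// which for a perspective camera encodes the vertex's depth.
const clip = new THREE.Vector4(v.x, v.y, v.z, 1).applyMatrix4(camera.projectionMatrix);
const perspectiveUv = new THREE.Vector2(clip.x / clip.w, clip.y / clip.w);
```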

I guess I have enough material to study and try to figure this out. I’ll take a look at the resources and see if I can come up with a solution to my problem.
If I find one, I’ll make sure to post a reply.

Thank you all :heart:

OK, thanks to prisoner849’s example I understand what’s wrong here.

The problem is that what I’m looking for appears to be impossible with plain 2D UVs. Even though the UVs map to the correct screen positions, the relationship between each fragment and the vertex UVs is non-linear, so when the UV coordinates are interpolated on their way to the fragment shader, the projection breaks :unamused:
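That said, there is a standard projective-texturing workaround, though it needs a custom attribute and shader rather than plain 2D UVs. A sketch, using the same assumed `mesh` and `camera` names as above:

```js
// Store homogeneous texture coordinates (u*w, v*w, w) per vertex. A custom
// shader then samples with texture2D(map, vUvw.xy / vUvw.z); that per-fragment
// divide cancels the GPU's perspective-correct interpolation, so the mapping
// becomes linear in screen space, matching each fragment's screen position.
const position = mesh.geometry.attributes.position;
const uvw = new THREE.BufferAttribute(new Float32Array(position.count * 3), 3);
const v = new THREE.Vector4();

mesh.updateMatrixWorld();
camera.updateMatrixWorld(); // also refreshes camera.matrixWorldInverse
for (let i = 0; i < position.count; i++) {
  v.set(position.getX(i), position.getY(i), position.getZ(i), 1)
    .applyMatrix4(mesh.matrixWorld)          // local -> world
    .applyMatrix4(camera.matrixWorldInverse) // world -> view
    .applyMatrix4(camera.projectionMatrix);  // view -> clip, w preserved
  const u = (v.x / v.w) * 0.5 + 0.5;         // the screen-space UV as before
  const t = (v.y / v.w) * 0.5 + 0.5;
  uvw.setXYZ(i, u * v.w, t * v.w, v.w);      // premultiply by w
}
mesh.geometry.setAttribute('uvw', uvw);
```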

Welp, I guess I’m gonna look for a different approach to my problem.

Thanks again guys.

I found this relevant article by Nathan Reed discussing the problem I ran into. I’ve already solved it by modifying my approach, but I feel like this could be useful for future reference.

three-projected-material :point_down: :eyes:

Hi,
Did you find a good solution in the end?