I am trying to implement this paper in three.js, and in order to do so I need to get the direction the arrows are going at each pixel here in screen space. The arrows are just visuals; this isn’t going to rely on them. They go along the positive y axis of the UV coordinates. I am attempting to calculate an orientation map in the fragment shader so that I can feed it into the non-photorealistic post-processor. Ignore the hall-of-mirrors hell going on in the background.
I have tried converting [0,1,0] from tangent space to world space and then projecting that onto the view plane, which did not work. I also tried taking the dFdx and dFdy of the UV coordinates, which also did not work.
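For reference, the tangent-space attempt looked roughly like this (reconstructed from memory, so the varying name vBitangent is approximate):

```glsl
// Roughly what I tried: [0,1,0] in tangent space is the bitangent,
// i.e. the UV +y direction, so move it to view space and drop the
// depth component to project it onto the view plane.
varying vec3 vBitangent; // world-space TBN * vec3(0.0, 1.0, 0.0)

void main() {
  vec3 b = normalize((viewMatrix * vec4(vBitangent, 0.0)).xyz);
  vec2 dir = normalize(b.xy); // view-plane projection; didn't work
  gl_FragColor = vec4(dir * 0.5 + 0.5, 0.0, 1.0);
}
```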
(Small digression / request / unrelated comment - if possible, could you please share your progress and the final result when ready? The shader in the whitepaper looks absolutely stunning.)
I absolutely will. It’s not implemented in exactly the same way as in the paper, though, and the person who wrote the paper actually knew what they were doing. Currently, mine looks absolutely horrible because the orientations won’t work.
I have not read the paper, but do you need the direction in screen space or world space? If you need it in screen space, the dFdx/dFdy functions should do what you need:
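Something like this (untested sketch; WebGL2 has derivatives built in, WebGL1 needs the OES_standard_derivatives extension):

```glsl
// Untested sketch: the screen-space gradient of the V coordinate is the
// direction the UV +y axis moves across the screen.
varying vec2 vUv;

void main() {
  vec2 dir = normalize(vec2(dFdx(vUv.y), dFdy(vUv.y)));
  // Remap from [-1, 1] to [0, 1] so negative components aren't clipped.
  gl_FragColor = vec4(dir * 0.5 + 0.5, 0.0, 1.0);
}
```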
I need the direction in screen space. Wouldn’t dFdx(vUv) return a vec2? This is just getting injected into the MeshPhongMaterial shader source, and then it’s output to a texture for orientation.
Why are you modifying the Phong material for this instead of using a custom shader? The Phong material will include lighting effects, which will affect the result. You’ll probably need to render the scene in two passes: once for the “beauty” pass and once for the UVs / UV direction.
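Rough sketch of what I mean, assuming your existing scene, camera, and renderer, plus a render target (orientationTarget here is a placeholder name):

```js
// Pass 1: orientation – override every material with a bare ShaderMaterial
// that writes the screen-space UV direction instead of lit colors.
const uvDirMaterial = new THREE.ShaderMaterial({
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }`,
  fragmentShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vec2 dir = normalize(vec2(dFdx(vUv.y), dFdy(vUv.y)));
      gl_FragColor = vec4(dir * 0.5 + 0.5, 0.0, 1.0);
    }`,
});

scene.overrideMaterial = uvDirMaterial;
renderer.setRenderTarget(orientationTarget);
renderer.render(scene, camera);

// Pass 2: beauty – normal materials and lighting, to the screen.
scene.overrideMaterial = null;
renderer.setRenderTarget(null);
renderer.render(scene, camera);
```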
It gets injected into a loaded mesh. The custom shader comes later and processes a bunch of different textures storing information that comes out of the MeshPhongMaterial. It is attached to a bunch of vertices in a pointMesh.
This doesn’t work and I don’t know why. The arrows do this weird swirl thing, and they also don’t point in the right direction a lot of the time. Also, the arrows that are completely vertical or horizontal are manually overridden.
You’re not providing enough information for me to help. I’m not sure how your model is structured, how the UVs are laid out, how the arrows are generated, etc. The issue could be in any one of those spots.
First, I would make sure the rendered UVs and the per-pixel direction values are what you expect to see.
What do you mean by how my model is structured? If you mean how it was made, I exported it from Blender as a .obj after doing Smart UV Project and then loaded it using OBJLoader. I don’t know what you mean by how the UVs are laid out, but the arrows get their direction from a framebuffer into which I render the direction I want them to point. Each arrow is a vertex in a pointMesh, and in their vertex shader the framebuffer is sampled at their location on the screen. Their rotation is then the xy component of the framebuffer sample. Then I build a rotation matrix and multiply the coordinates they use to sample their color from the strokeTexture by it. I have double-checked that the arrow rotation code works, and they do point in the direction they are supposed to.
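The relevant bits of the arrow shaders look roughly like this (simplified from the real code, which has the manual overrides mixed in):

```glsl
// --- Vertex shader (one point per arrow) ---
// Sample the orientation framebuffer at the arrow's screen position
// and build a 2D rotation from its xy.
uniform sampler2D normalTexture; // framebuffer with orientation in xy
varying mat2 vRot;

void main() {
  vec4 clip = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  vec2 screenUv = clip.xy / clip.w * 0.5 + 0.5;
  vec2 dir = normalize(texture2D(normalTexture, screenUv).xy * 2.0 - 1.0);
  // Column-major rotation built from dir; the exact signs depend on the
  // stroke texture's orientation and gl_PointCoord's flipped y.
  vRot = mat2(dir.y, -dir.x, dir.x, dir.y);
  gl_PointSize = 32.0;
  gl_Position = clip;
}

// --- Fragment shader ---
// Rotate the point sprite's sampling coordinates around the center
// before looking up the stroke texture.
uniform sampler2D strokeTexture;
varying mat2 vRot;

void main() {
  vec2 p = vRot * (gl_PointCoord - 0.5) + 0.5;
  gl_FragColor = texture2D(strokeTexture, p);
}
```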
Is it possible that the arrows do the weird swirly thing because this is a perspective camera?
I don’t know what your model looks like or what the UVs on it look like. As far as I know that image could be correct. Depending on the UVs of the model you might get that “swirl” effect.
Without some example code and the model, I can only guess where the problem might be; it’s hard to understand or debug the issue otherwise.
This is the model from the same angle without the wacky overlaid arrows. The arrows point toward the positive y direction of the UVs. This thing is part of a larger project, and the code is kind of interconnected, along with a bunch of junk that does unrelated things. What specific parts of my code should I provide? Like the shader source?
I’ve included a negated version of the UV gradients because negative UV gradients render as black otherwise. This is more or less what I expect – if you look at the floor, the UV.y value is increasing down and to the right, which is what the colors in the UV gradient image indicate. Maybe check to make sure your renders look similar, and check / share a demo of the arrow rendering code?
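For reference, the debug shader I used was along these lines (scaled up, since the raw per-pixel derivatives are tiny):

```glsl
// Debug view of the raw UV.y gradient. Negative components clip to
// black, hence also rendering a second, negated copy.
varying vec2 vUv;

void main() {
  vec2 grad = vec2(dFdx(vUv.y), dFdy(vUv.y)) * 100.0; // scale tiny derivatives
  gl_FragColor = vec4(grad, 0.0, 1.0); // swap in -grad for the negated image
}
```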
Here is the arrow stuff without the random extra junk. Make sure to replace colorTexture with a framebuffer containing your scene’s normal color, and normalTexture with a framebuffer containing your orientations in the xy. strokeTexture needs to be a 256x256 image, like the attached ones. The arrows on the floor are twitchier than they normally are because normally I manually override their direction.
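The setup on my end looks roughly like this (names and sizes approximate; the render targets’ .texture properties are what get passed as the uniforms):

```js
// Roughly how the textures are wired up on my end.
const size = new THREE.Vector2();
renderer.getSize(size);

// Scene rendered with its normal materials ("beauty" colors).
const colorTexture = new THREE.WebGLRenderTarget(size.x, size.y);

// Scene rendered with the orientation shader; direction lives in .xy.
const normalTexture = new THREE.WebGLRenderTarget(size.x, size.y);

// 256x256 stroke image, like the attached ones.
const strokeTexture = new THREE.TextureLoader().load('stroke.png');
```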
Thanks for the pen – I’ve messed around with it for a bit and I think I see what’s happening now, but I don’t know how to get the behavior you want. I believe that as the triangles reach the edges of the field of view they get stretched, so the gradient is not consistent across the face in screen space (due to the perspective camera, like you’d suggested previously).
Perhaps the view ray can be used to account for that? I’m now very perplexed by this. I’ve tried looking around online for other OpenGL or DirectX implementations that do this sort of thing, but I’ve had no luck. It definitely seems like it should be possible to get the face direction via a shader…
I checked, and it does have something to do with perspective. They don’t curl when using an orthographic camera, but they also don’t line up correctly, and I don’t want to use an orthographic camera. How would I use a view ray to fix the curling?
I haven’t thoroughly thought it through. It seems like you might be able to use the angle of the view vector to determine how much distortion there is at any given point, but I’m not sure. I’ll have to mull on it more…
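If it helps as a starting point (just a guess, nothing I’ve verified), the per-fragment view direction is easy to get in three.js:

```glsl
// Unverified guess: measure how far this fragment's view ray deviates
// from the camera's forward axis. Assumes the vertex shader writes
// vWorldPosition and that camForward (the camera's world-space forward
// direction) is passed in as a uniform; cameraPosition is built in.
uniform vec3 camForward;
varying vec3 vWorldPosition;

void main() {
  vec3 viewDir = normalize(vWorldPosition - cameraPosition);
  float offAxis = acos(clamp(dot(viewDir, camForward), -1.0, 1.0));
  // Visualize the off-axis angle; some correction to the UV gradient
  // might be derivable from it.
  gl_FragColor = vec4(vec3(offAxis), 1.0);
}
```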