Save calculated fragment/pixel shader as texture

I have a 3D model with a custom shader material applied to it. Now I want to save the fragment shader's output as a texture. How can I actually retrieve the texture bytes from the custom pixel shader applied to my model so I can save it? I've heard I need to use a framebuffer or something like that.

Thanks in advance

Create a separate Scene object and add a 1x1 plane and a camera to it. Assign your shader material as the material of the plane. Create a new RenderTarget, assign it to your renderer, then render a single frame of the scene containing the plane, using that camera.

You should be able to find your material-as-a-texture in `renderTarget.texture` afterwards.
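A minimal sketch of those steps, assuming a recent three.js version, an existing `renderer` (THREE.WebGLRenderer), and your custom material in `shaderMaterial`:

```js
import * as THREE from 'three';

// separate scene with a 1x1 plane carrying the shader material
const rtScene = new THREE.Scene();
const rtCamera = new THREE.OrthographicCamera(-0.5, 0.5, 0.5, -0.5, 0.1, 10);
rtCamera.position.z = 1;
rtScene.add(new THREE.Mesh(new THREE.PlaneGeometry(1, 1), shaderMaterial));

// render one frame into an offscreen target instead of the canvas
const renderTarget = new THREE.WebGLRenderTarget(1024, 1024);
renderer.setRenderTarget(renderTarget);
renderer.render(rtScene, rtCamera);
renderer.setRenderTarget(null); // restore rendering to the canvas

// renderTarget.texture is now usable like any texture; to get raw bytes:
const pixels = new Uint8Array(1024 * 1024 * 4); // RGBA
renderer.readRenderTargetPixels(renderTarget, 0, 0, 1024, 1024, pixels);
```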

Huge thanks for your fast reply! Unfortunately, I modified my UV coordinates based on my geometry, so I can't apply the same material to the plane; it won't produce the same pixels as on the 3D model. Is there any other way to do it?

UV coordinates are a property of the geometry, not of the material. Modifying them shouldn't matter :thinking:

I apologize, I explained myself poorly. I meant that I've modified my fragment shader to be a triplanar projection shader. What I actually want to do is retrieve the texture as calculated on the 3D model and store it as a 2D texture (much like baking in Substance Painter by Adobe). Since the shader is geometry-dependent, you can't apply it to a plane, as it won't produce the same result.
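For context (my illustration, not code from this thread), a typical triplanar fragment looks something like the snippet below. The output depends on the mesh's own positions and normals rather than its UVs, which is why the plane trick can't reproduce it:

```glsl
// standard triplanar blend: sample the texture along each axis and
// weight by the surface normal. vPosition/vNormal are assumed varyings
// from the vertex shader; "map" is the projected texture.
vec3 w = abs(normalize(vNormal));
w /= (w.x + w.y + w.z);
vec4 xSample = texture2D(map, vPosition.yz);
vec4 ySample = texture2D(map, vPosition.xz);
vec4 zSample = texture2D(map, vPosition.xy);
gl_FragColor = xSample * w.x + ySample * w.y + zSample * w.z;
```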

Anyone got another idea?

Well, in order to store a shader's computation result as an image, you will definitely need a 'render to texture' mechanism. There is a dedicated example in the official repo, following pretty much the same steps @mjurczyk described.

I also get what you mean about needing the result for that specific geometry. Consider, though, that to assign the resulting texture to a mesh later you will need a UV map, which rather defeats the purpose of implementing triplanar projection in the first place.

If you do set up a UV map, then you could try (mostly for fun) a copy-and-paste approach (again, render-to-texture here): loop over each UV triangle, locate and orient the camera perpendicular to the corresponding surface, and render into a defined renderTarget, progressively filling in the resulting texture, ready to be saved. Mapping UV coordinates to mesh coordinates is the real challenge here; the rest should be straightforward. A rough skeleton follows below.

This should do the job for most Lambertian materials, but not for view-dependent surface phenomena (e.g. Phong specular reflections).
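A very rough, untested skeleton of that per-triangle loop. It assumes an indexed geometry and pre-existing `mesh`, `scene`, and `renderer`; the orthographic frustum size is arbitrary rather than fitted to each face, and the UV-to-pixel mapping is only a bounding-box approximation:

```js
const SIZE = 2048;
const bakeTarget = new THREE.WebGLRenderTarget(SIZE, SIZE);
bakeTarget.scissorTest = true; // only write inside each face's UV rect

const cam = new THREE.OrthographicCamera(-0.5, 0.5, 0.5, -0.5, 0.01, 10);
const pos = mesh.geometry.getAttribute('position');
const uvs = mesh.geometry.getAttribute('uv');
const index = mesh.geometry.index;
const tri = new THREE.Triangle();
const normal = new THREE.Vector3();
const center = new THREE.Vector3();

for (let f = 0; f < index.count; f += 3) {
  const a = index.getX(f), b = index.getX(f + 1), c = index.getX(f + 2);
  tri.a.fromBufferAttribute(pos, a);
  tri.b.fromBufferAttribute(pos, b);
  tri.c.fromBufferAttribute(pos, c);
  tri.getNormal(normal);
  tri.getMidpoint(center);

  // orient the camera perpendicular to the face
  cam.position.copy(center).addScaledVector(normal, 1);
  cam.lookAt(center);
  cam.updateMatrixWorld();

  // scissor = bounding box of the face's UV triangle, in texture pixels
  const us = [uvs.getX(a), uvs.getX(b), uvs.getX(c)];
  const vs = [uvs.getY(a), uvs.getY(b), uvs.getY(c)];
  bakeTarget.scissor.set(
    Math.min(...us) * SIZE,
    Math.min(...vs) * SIZE,
    (Math.max(...us) - Math.min(...us)) * SIZE,
    (Math.max(...vs) - Math.min(...vs)) * SIZE
  );

  renderer.setRenderTarget(bakeTarget); // re-applies the updated scissor
  renderer.render(scene, cam); // note: still renders the whole scene
}
renderer.setRenderTarget(null);
```

Note that nothing in this loop isolates a single polygon, which turns out to be the sticking point below.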


Yeah, I already have a UV map on my model, ready to be projected onto a texture. I'll try your way and let you know if I run into any problems. Thanks for your response!

Well, the real problem with this method is that I can't isolate a single polygon.

Actually, I just thought of a way: I could unwrap my 3D model exactly as its UV layout, store each vertex's original position in its vertex color, run the triplanar projection on the vertex color instead of the actual position, and then simply render that with a camera.
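A minimal sketch of that unwrap-to-UV bake, under the assumption that the original object-space position has been packed into the vertex color attribute (the triplanar body itself is left as a placeholder):

```js
const bakeMaterial = new THREE.ShaderMaterial({
  vertexColors: true, // makes the color attribute available in the shader
  vertexShader: /* glsl */ `
    varying vec3 vOrigPosition;
    void main() {
      // the color attribute carries the original position
      vOrigPosition = color;
      // flatten the mesh onto its UV layout: uv in [0,1] -> clip space [-1,1]
      gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    varying vec3 vOrigPosition;
    void main() {
      // run the triplanar lookup against vOrigPosition here;
      // writing it out directly is just a placeholder
      gl_FragColor = vec4(vOrigPosition, 1.0);
    }
  `,
});
// Render the mesh once with bakeMaterial into a WebGLRenderTarget; since
// gl_Position is written in clip space directly, any camera will do.
```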

Update: the idea was really good, but the problem is the vertex colors: they only support a limited number of decimal places, which ends up deforming the UVs, and on top of that they're unsigned :confused:
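One possible workaround (my assumption, not something confirmed in this thread): skip vertex colors entirely and copy the original positions into a custom float attribute, which keeps full 32-bit precision and allows negative values:

```js
// duplicate the position attribute under a custom (illustrative) name
const src = geometry.getAttribute('position');
geometry.setAttribute(
  'origPosition',
  new THREE.BufferAttribute(src.array.slice(), 3)
);

// In the ShaderMaterial's vertex shader, custom attributes must be
// declared manually:
//   attribute vec3 origPosition;
//   ...
//   vOrigPosition = origPosition;
```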