Points transparent textures depth artifacts (soft particles)

Hmm, thank you! The approach with THREE.DepthTexture is a lot simpler than my way of playing around with MeshDepthMaterial. I have the depth texture of the solid objects ready to be passed to my soft particles shader, but I don’t know how to properly read the depth of my particle in the shader. If I understand the soft particles algorithm correctly, I need it in order to compare both values and check how close the particle is to the scene meshes (detect edges and smooth out the alpha).

It’s gl_FragCoord.z. I think it makes sense to linearize this value and the value from the depth map if you are going to compare them. You can do this by including the packing chunk into your shader and then doing this:

#include <packing>

uniform float cameraNear;
uniform float cameraFar;

float getLinearDepth( const in float fragCoordZ ) {

    float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
    return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );

}

The following example uses this code: https://threejs.org/examples/webgl_depth_texture

As you can see, you need to pass in cameraNear and cameraFar as uniforms to the shader.
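
To make the comparison concrete, here is a minimal sketch of what the particle’s fragment shader could do with this helper. Names like tDepth, resolution and softness are illustrative, not fixed API:

#include <packing>

uniform sampler2D tDepth;   // depth texture of the solid objects
uniform vec2 resolution;    // size of the render target in pixels
uniform float cameraNear;
uniform float cameraFar;
uniform float softness;     // fade range in linear [0,1] depth

float getLinearDepth( const in float fragCoordZ ) {

    float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
    return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );

}

void main() {

    vec2 screenUV = gl_FragCoord.xy / resolution;

    float sceneDepth = getLinearDepth( texture2D( tDepth, screenUV ).x );
    float particleDepth = getLinearDepth( gl_FragCoord.z );

    // 0.0 where the particle touches solid geometry, 1.0 when far enough away
    float fade = clamp( ( sceneDepth - particleDepth ) / softness, 0.0, 1.0 );

    gl_FragColor = vec4( vec3( 1.0 ), fade );

}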

Thank you so much! :star_struck:
I finally got it to blend nicely :heart_eyes:

(screenshot: the soft particles now blend smoothly with the scene)

Thank you, wouldn’t have done it without your help!
Also btw I think DepthTexture, just like WebGLRenderTarget, uses min and mag filters (unless the docs have it wrong)
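
For reference, the render target setup in the webgl_depth_texture example linked above looks roughly like this, and the depth texture indeed exposes min and mag filters like any other texture:

var target = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
target.texture.minFilter = THREE.NearestFilter;   // filters work as on any texture
target.texture.magFilter = THREE.NearestFilter;
target.stencilBuffer = false;
target.depthBuffer = true;
target.depthTexture = new THREE.DepthTexture();
target.depthTexture.format = THREE.DepthFormat;
target.depthTexture.type = THREE.UnsignedShortType;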

5 Likes

Would you like to share your code as a new three.js example? I think it would be good to have a soft particles example in the repo. It could be named webgl_points_soft.

3 Likes

Sure, would love to give back to the community! :heart_eyes:
I’d have to clean up the code first, as it’s a bit messy right now. Should I post it here?

2 Likes

Yeah, why not. We can start to review the code here and fine-tune it at GitHub.

1 Like

Hmm, should I paste the whole code here or just attach a .zip?
softParticles.zip (1.5 MB)
I have to say, I’m not sure what should go into the uvTransform uniform of the shader material.

1 Like

Let me have a look at your code tomorrow :blush:

  • I’ve seen that you have copied the entire points material shader. I personally would keep the example simpler and remove all shader chunks which are not directly related to soft particles.

  • If you are writing a custom shader, it’s important to structure the shader file consistently (see the sketch after this list). Make sure to define all uniforms in the uniforms section; specifying a type for each uniform is not necessary.

  • When you create a shader material based on your custom shader, always do it as shown in the following listing. It’s very important to clone your uniforms, otherwise you can get very strange side effects if more than one material is created from this shader.

this.customMaterial = new THREE.ShaderMaterial( {
    defines: Object.assign( {}, THREE.CustomShader.defines ),
    uniforms: THREE.UniformsUtils.clone( THREE.CustomShader.uniforms ),
    vertexShader: THREE.CustomShader.vertexShader,
    fragmentShader: THREE.CustomShader.fragmentShader,
    // now set common material properties
    transparent: true,
    depthTest: false,
    depthWrite: false
} );
  • The code of your example does not respect the three.js code style yet. Make sure to lint your code with the mdcs setting, don’t use ES6 statements like let, and structure your demo like the official ones. Notice how functions are defined and at what place init() and animate() are executed. Also avoid flags that control optional features of the demo (like your shadows variable).

  • You are using THREE.DepthTexture correctly but you perform an unnecessary render pass. After configuring your render target, you can render the diffuse and depth information in a single pass. Right now, you use renderer.render() to save the depth information and then use an instance of RenderPass for the beauty pass. The latter is not necessary. Consider managing your code in a SoftParticlesPass similar to SAOPass.
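
Regarding the second point, such a shader file could be structured like this (THREE.CustomShader and its uniform names are placeholders for illustration):

THREE.CustomShader = {

    defines: {},

    uniforms: {

        // all uniforms used by the shaders are declared here; no "type" property
        'tDepth': { value: null },
        'cameraNear': { value: 1 },
        'cameraFar': { value: 1000 }

    },

    vertexShader:
        'void main() {\n' +
        '	gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );\n' +
        '}',

    fragmentShader:
        'void main() {\n' +
        '	gl_FragColor = vec4( 1.0 );\n' +
        '}'

};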

3 Likes

Hello, I’ve gotten this far today; I’ll do more tomorrow.
softParticlesExample.zip (1.5 MB)
I’ve narrowed the whole scene down to the objects necessary for the example and reformatted most of the code. Also, could you please expand on your last point? I’m not sure how to implement that.

2 Likes

I mean that you create a new file, SoftParticlesPass, and implement the rendering logic related to soft particles there. You don’t use EffectComposer inside SoftParticlesPass but manage the render targets yourself, similar to SSAOPass. This should give you the flexibility you need in order to save the mentioned render pass. In your app/example, you could use the pass like so:

var softParticlesPass = new THREE.SoftParticlesPass( scene, camera );
softParticlesPass.renderToScreen = true;

var effectComposer = new THREE.EffectComposer( renderer );
effectComposer.addPass( softParticlesPass );

I’m not sure this is the best approach but you need to find a way to get rid of the additional render pass. Setting the depth texture uniform of your soft particles material is a bit tricky in SoftParticlesPass. Consider using a method like SoftParticlesPass.addPoints() in order to tell the pass which point objects should be used.
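
A rough skeleton of what I have in mind, following the prototype-based pass pattern used by passes like SSAOPass (all names here are only a suggestion):

THREE.SoftParticlesPass = function ( scene, camera ) {

    THREE.Pass.call( this );

    this.scene = scene;
    this.camera = camera;
    this.points = [];

    // render target with a depth attachment for the solid part of the scene
    this.beautyRenderTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
    this.beautyRenderTarget.depthTexture = new THREE.DepthTexture();

};

THREE.SoftParticlesPass.prototype = Object.assign( Object.create( THREE.Pass.prototype ), {

    constructor: THREE.SoftParticlesPass,

    // tell the pass which point objects should be treated as soft particles
    addPoints: function ( points ) {

        points.material.uniforms.tDepth.value = this.beautyRenderTarget.depthTexture;
        this.points.push( points );

    },

    render: function ( renderer, writeBuffer, readBuffer, deltaTime, maskActive ) {

        // 1. hide the points and render the solid scene into beautyRenderTarget
        // 2. composite the color buffer and render the points on top, with their
        //    shaders reading the depth attachment via the tDepth uniform

    }

} );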

BTW: Your code looks much better than before!

Thank you!
Unfortunately this seems a little too much for me :confused:. I’ve never made a custom pass and I have no idea how to handle rendering just the particles. Also, I’m not sure why in DepthTarget.texture (the main render target that supplies the depth texture to the softParticlesMaterial) the clouds are actually showing, but only those within 2 or 3 units of distance from the camera:
(screenshot of DepthTarget.texture: the normal number of points is present, but only one or two of the nearest are visible)

Sorry, I’m really not well versed in this topic :s

Okay, depending on how much time I have this week, I’ll try to make the changes and host a fiddle with the code. The current progress is already great, but let’s see if we can push things a bit further^^.

2 Likes

I have ported your code to a fiddle and cleaned it up a bit. Now it’s easier to work with the example:

https://jsfiddle.net/m7tvxpbs/

Unfortunately, I was not yet able to avoid drawing the scene twice. You can see the problem in how the render function currently works: it’s necessary to render the scene once in order to save the depth information, and then you have to render it again together with the soft particles. I think it should be possible to draw the scene once, then the particles, and composite both results into a final image. But like I said before, I was not able to do so since the blending did not work so far. At least we now have a live example, so others can play around with the code.
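
In outline, the render function currently does something like this (a simplified sketch, the actual fiddle code differs in details):

function render() {

    // first render: scene without particles, only to obtain the depth texture
    points.visible = false;
    renderer.setRenderTarget( depthTarget );
    renderer.render( scene, camera );

    // second render: the same scene again, this time with the soft particles
    points.visible = true;
    renderer.setRenderTarget( null );
    renderer.render( scene, camera );

}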

4 Likes

Since we only need the depth texture from the first render, would setting the renderTarget's format to THREE.DepthFormat help the optimization a bit? Or is that not where the heavy lifting happens at all, so the change wouldn’t even be noticeable?

Hey, I asked myself that very same question. I soon realized that since you need two passes anyway, it’s best to render both color and depth to a render target in the first pass, using a depth attachment. Then you take the depth attachment and feed it into the particle rendering pass. I managed to gain a significant performance boost from that.
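
A minimal sketch of that idea (all names illustrative; it assumes the solid objects and the points live in two separate scenes, so the solid scene is really only drawn once):

var target = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
target.depthTexture = new THREE.DepthTexture();

// fullscreen quad used to copy the color attachment to the screen
var copyCamera = new THREE.OrthographicCamera( - 1, 1, 1, - 1, 0, 1 );
var copyScene = new THREE.Scene();
copyScene.add( new THREE.Mesh(
    new THREE.PlaneBufferGeometry( 2, 2 ),
    new THREE.MeshBasicMaterial( { map: target.texture, depthWrite: false } )
) );

renderer.autoClear = false; // we clear and composite manually

function render() {

    // pass 1: solid objects only; color and depth both end up in "target"
    renderer.setRenderTarget( target );
    renderer.clear();
    renderer.render( solidScene, camera );

    // pass 2: copy the color buffer to the screen, then draw the particles
    // on top while their shader reads the depth attachment
    softParticlesMaterial.uniforms.tDepth.value = target.depthTexture;
    renderer.setRenderTarget( null );
    renderer.clear();
    renderer.render( copyScene, copyCamera );
    renderer.render( particleScene, camera );

}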

1 Like

Sounds good! Does anyone want to update the fiddle with this approach? :innocent:

Testing the jsfiddle, I get a problem with the animation using Chrome:
[.WebGL-0x12ddcd309a00]GL ERROR :GL_INVALID_OPERATION : glDrawArrays: Source and destination textures of the draw are the same.

Updated fiddle:
https://jsfiddle.net/wfo5cvur/5/

1 Like

Without renderTarget:
https://jsfiddle.net/wfo5cvur/9/

1 Like