Points transparent textures depth artifacts (soft particles)

Hello :slight_smile:
I’ve been playing around with a fog effect, using THREE.Points with some transparent smoke textures. The first problem came up when the fully transparent edges of the images clipped weirdly:

From this Stack Overflow thread, javascript - Three.js - depthWrite vs depthTest for transparent canvas texture map on THREE.Points - Stack Overflow, I found out that it’s due to depth testing. I set the material’s depthTest to false. It looked great, exactly like I wanted, but unfortunately, since depth testing is off, the smoke was visible through walls and any other objects:


(and behind a wall)

After switching to [depthWrite: false], I got a more correct result: the clipping is gone and walls obstruct the images. However, now you can see these ugly edges where the particles intersect a mesh:

I realize those are natural and expected, but it nonetheless looks unpleasant to the eye. I’m just trying to create a nice-looking fog effect :slight_smile:. Is there any smart way of avoiding these ugly edges in the [depthWrite: false] scenario? Maybe some clever custom shader material trick or post-processing effect?
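
For reference, here is roughly what my setup looks like with a plain PointsMaterial (a minimal sketch; the texture path and size are illustrative):

const smokeTexture = new THREE.TextureLoader().load( 'smoke.png' ); // illustrative path

const smokeMaterial = new THREE.PointsMaterial( {
    size: 10,
    map: smokeTexture,
    transparent: true,
    // depthTest: false  -> no clipping, but the smoke shows through walls
    depthWrite: false    // -> walls occlude the smoke, but hard edges appear at intersections
} );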

A point cloud with textures looks really good and I’d love to make it work! :star_struck:
Thank you for any suggestions!

1 Like

I think you are looking for a technique called soft particles. From an Nvidia resource:

Particle sprites are widely used in most games. They are used for a variety of semitransparent effects such as fog, smoke, dust, magic spells and explosions. However, sprites commonly produce artifacts – unnaturally sharp edges – where they intersect the rest of the scene.
We present a simple way to fade out flat sprites against a 3D scene, softening the artificial edges. Two solutions are implemented: one uses the ability of DirectX10 to read the depth buffer as a texture; the other uses a more conventional second render target to store depth values.

@korner already experimented with soft particles right here: Soft Particles render

1 Like

Thanks a lot for these resources. http://blog.wolfire.com/2010/04/Soft-Particles explains this well; however, I’m having trouble implementing it in three.js, specifically with getting the depth value of the particles and passing the depth buffer to their shader. Is it possible to use a modified version of the existing PointsMaterial, or is a custom shader a must?

1 Like

I would implement this with a custom shader; that’s better than hacking PointsMaterial. Besides, the shader code of PointsMaterial is straightforward, so it’s no problem to copy the essential parts into your own code.
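
In case it helps, a stripped-down points material could look roughly like this (a sketch, not the actual PointsMaterial code; smokeTexture and the attenuation factor are illustrative):

const material = new THREE.ShaderMaterial( {
    uniforms: {
        map: { value: smokeTexture }, // illustrative texture uniform
        size: { value: 10.0 }
    },
    vertexShader: [
        'uniform float size;',
        'void main() {',
        '	vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );',
        '	gl_PointSize = size * ( 300.0 / - mvPosition.z ); // simple size attenuation',
        '	gl_Position = projectionMatrix * mvPosition;',
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'uniform sampler2D map;',
        'void main() {',
        '	gl_FragColor = texture2D( map, gl_PointCoord );',
        '}'
    ].join( '\n' ),
    transparent: true,
    depthWrite: false
} );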

I got the shader up and running, but what would be the proper way to get the depth value of the particles, in the same scale as MeshDepthMaterial produces for my scene? I tried to analyze its shader code, but it uses many #include statements and I got a little lost :s

1 Like

I’m not sure if you can use the following approach in your case, but the normal way is to render the whole scene with MeshDepthMaterial into a render target. You can then use the resulting texture as a depth map for your shader. However, I suggest you use THREE.DepthTexture instead, since the respective WebGL extension is widely supported. In this way, the depth buffer is automatically stored into a texture when you render your scene. A typical setup looks like this:

const depthTexture = new THREE.DepthTexture();
depthTexture.type = THREE.UnsignedShortType;
depthTexture.minFilter = THREE.NearestFilter;
depthTexture.magFilter = THREE.NearestFilter;

The respective render target:

const renderTarget = new THREE.WebGLRenderTarget( width, height, {
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter,
    format: THREE.RGBAFormat,
    depthTexture: depthTexture,
    depthBuffer: true
} );
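
Filling the target could then work like this (a sketch; setRenderTarget() is the current API, older releases pass the target to renderer.render() directly):

renderer.setRenderTarget( renderTarget );
renderer.render( scene, camera ); // depth values end up in depthTexture
renderer.setRenderTarget( null );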

After rendering, you can then use the depth texture as an input for your soft particles shader. Maybe like so:

shader.uniforms[ 'tDepth' ].value = depthTexture;

You can then read the depth values in GLSL like so:

float depth = texture2D( tDepth, vUv ).x;

Check out the implementation of SSAOShader to see a full working example.

1 Like

Hmm, thank you! The approach with THREE.DepthTexture is a lot simpler than my way of playing around with MeshDepthMaterial. I have the depth texture of the solid objects ready to be passed to my soft particles shader, but I don’t know how to properly read the depth of my particle in the shader. If I understand the soft particles algorithm correctly, I need it in order to compare the two depths and check how close the particle is to the scene meshes (detect edges and smooth out the alpha).

1 Like

It’s gl_FragCoord.z. I think it makes sense to linearize this value and the value from the depth map if you are going to compare them. You can do this by including the packing chunk into your shader and then doing this:

#include <packing>

float getLinearDepth( float fragCoordZ ) {

    float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
    return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );

}

The following example uses this code: three.js webgl - Depth Texture

As you can see, you need to pass in cameraNear and cameraFar as uniforms to the shader.
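
Putting it together, the alpha fade could look roughly like this, using the getLinearDepth() helper from above (a sketch; the resolution uniform and the softness factor are assumptions, not something from this thread):

uniform sampler2D map;
uniform sampler2D tDepth;
uniform float cameraNear;
uniform float cameraFar;
uniform vec2 resolution; // assumed: size of the drawing buffer in pixels

void main() {

    // screen-space UV of this fragment (tDepth is assumed to match the drawing buffer size)
    vec2 screenUV = gl_FragCoord.xy / resolution;

    float sceneDepth = getLinearDepth( texture2D( tDepth, screenUV ).x );
    float particleDepth = getLinearDepth( gl_FragCoord.z );

    // fade the particle out as it approaches solid geometry;
    // 50.0 is an arbitrary softness factor to tune
    float fade = clamp( ( sceneDepth - particleDepth ) * 50.0, 0.0, 1.0 );

    gl_FragColor = texture2D( map, gl_PointCoord );
    gl_FragColor.a *= fade;

}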

Thank you so much! :star_struck:
I finally got it to blend nicely :heart_eyes:

Thank you, wouldn’t have done it without your help!
Also, btw, I think DepthTexture, just like WebGLRenderTarget, uses minFilter and magFilter (unless the docs have it wrong)

7 Likes

Would you like to share your code as a new three.js example? I think it would be good to have a soft particles example in the repo. It could be named webgl_points_soft.

3 Likes

Sure, would love to give back to the community! :heart_eyes:
I’d have to clean up the code first, as it’s a bit messy right now. Should I post it here?

4 Likes

Yeah, why not? We can start reviewing the code here and fine-tune it at GitHub.

1 Like

Hmm, should I paste the whole code here or just a .zip?
softParticles.zip (1.5 MB)
I have to say, I’m not sure what should go into the uvTransform uniform of the shader material.

2 Likes

Let me have a look at your code tomorrow :blush:

  • I’ve seen that you have copied the entire points material shader. I personally would keep the example simpler and remove all shader chunks that are not directly related to soft particles.

  • If you are writing a custom shader, it’s important to structure the file like the existing shader files in the repository. Ensure you define all uniforms in the uniforms section. The usage of type is not necessary.

  • When you create a shader material based on your custom shader, always do it as shown in the following listing. It’s very important to clone your uniforms; otherwise you can get very strange side effects if more than one material is created with this shader.

this.customMaterial = new THREE.ShaderMaterial( {
    defines: Object.assign( {}, THREE.CustomShader.defines ),
    uniforms: THREE.UniformsUtils.clone( THREE.CustomShader.uniforms ),
    vertexShader: THREE.CustomShader.vertexShader,
    fragmentShader: THREE.CustomShader.fragmentShader,
    // now set common material properties
    transparent: true,
    depthTest: false,
    depthWrite: false
} );
  • The code of your example does not respect the three.js code style yet. Ensure you lint your code with the mdcs setting, don’t use ES6 statements like let, and structure your demo like the official ones. Notice how functions are defined and at what point init() and animate() are executed. Also avoid flags that control optional features of the demo (like your shadows variable).

  • You are using THREE.DepthTexture correctly, but you perform an unnecessary render pass. After configuring your render target, you can render the diffuse and depth information in a single pass. Right now, you use renderer.render() to save the depth information and then use an instance of RenderPass for the beauty pass. The latter is not necessary. Consider managing your code in a SoftParticlesPass, similar to SAOPass.

3 Likes

Hello, I’ve gotten this far today; I’ll do more tomorrow.
softParticlesExample.zip (1.5 MB)
I’ve narrowed the whole scene down to the objects necessary for the example and reformatted most of the code. Also, could you please expand on your last point? I’m not sure how to implement that.

3 Likes

I mean you create a new file, SoftParticlesPass, and implement the rendering logic related to soft particles there. You don’t use EffectComposer in SoftParticlesPass but manage the render targets yourself, similar to SSAOPass. This should give you the flexibility you need in order to save the mentioned render pass. In your app/example, you could use the pass like so:

var softParticlesPass = new THREE.SoftParticlesPass( scene, camera );
softParticlesPass.renderToScreen = true;

var effectComposer = new THREE.EffectComposer( renderer );
effectComposer.addPass( softParticlesPass );

I’m not sure this is the best approach, but you need to find a way to get rid of the additional render pass. Setting the depth texture uniform of your soft particles material is a bit tricky in SoftParticlesPass. Consider using a method like SoftParticlesPass.addPoints() to tell the pass which point objects should be used.
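
A hypothetical skeleton of such a pass, mirroring how SSAOPass manages its own render targets, could look like this (everything below is an assumption, not an existing three.js class):

THREE.SoftParticlesPass = function ( scene, camera ) {

    THREE.Pass.call( this );

    this.scene = scene;
    this.camera = camera;
    this.points = []; // registered via addPoints()

    this.depthTexture = new THREE.DepthTexture();

    this.renderTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, {
        depthTexture: this.depthTexture,
        depthBuffer: true
    } );

};

THREE.SoftParticlesPass.prototype = Object.assign( Object.create( THREE.Pass.prototype ), {

    constructor: THREE.SoftParticlesPass,

    addPoints: function ( points ) {

        this.points.push( points );

    },

    render: function ( renderer, writeBuffer, readBuffer ) {

        // 1. render the solid scene into this.renderTarget to fill the depth texture
        // 2. assign this.depthTexture to the tDepth uniform of each points material
        // 3. render the scene plus the particles to the target or the screen

    }

} );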

BTW: Your code looks much better than before!

Thank you!
Unfortunately, this seems a little too much for me :confused:. I’ve never made a custom pass, and I have no idea how to handle rendering just the particles. Also, I’m not sure why in DepthTarget.texture (the main render target that supplies the depth texture to the softParticlesMaterial) the clouds are actually showing, but only those within 2 or 3 units of the camera:


(there is the normal number of points here, but only one or two of the nearest are visible)

Sorry, I’m really not well versed in this topic :s

1 Like

Okay, depending on how much time I have this week, I’ll try to make the changes and host a fiddle with the code. The current progress is already great, but let’s see if we can push things a bit further^^.

2 Likes

I have ported your code to a fiddle and cleaned it up a bit. Now it’s easier to work with the example:

https://jsfiddle.net/m7tvxpbs/

Unfortunately, I was not yet able to avoid drawing the scene twice. You can see the problem in how the render function currently works: it’s necessary to render the scene once in order to save the depth information, and then you have to render it again together with the soft particles. I think it should be possible to draw the scene once, then the particles, and composite both results into a final image. But as I said before, I was not able to do so since the blending did not work so far. At least we now have a live example, so others can play around with the code.
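
In essence, the render function currently boils down to two full scene draws, roughly like this (a sketch; toggling the particles via visible is just one way to write it):

points.visible = false;
renderer.setRenderTarget( renderTarget ); // pass 1: solids only, fills the depth texture
renderer.render( scene, camera );

points.visible = true;
softParticlesMaterial.uniforms.tDepth.value = renderTarget.depthTexture;
renderer.setRenderTarget( null ); // pass 2: the whole scene again, plus the soft particles
renderer.render( scene, camera );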

4 Likes