Problem with very transparent objects

Hi Y’all,

I am trying to overlay hundreds/thousands of simple transparent objects to make interesting textures.

I am having an issue where the opacity does not seem to behave as expected when set to a value lower than around 0.02

If using black objects on a white background, the objects collectively won’t approach black. They will stay at a gray no matter how many are stacked.

Here is a JS Fiddle with an example

I moved the circles around a little so they aren’t in the same place.

I have also tried many other materials and material settings with no luck. It always produces this result.

Using alphaHash does allow the opacity to go very small and still behave as expected. But it’s a really different look to what I am trying to achieve.

If anyone has any advice or ideas it would be greatly appreciated!

[…] overlay hundreds/thousands of simple transparent objects […]


My dear brother in Christ, the only thing wilder than that single sentence is the age of the version of three.js you’ve decided to use :sob:


To answer your question though: in a default three.js setup you're effectively "allowed" to overlay only two transparent objects correctly. Even if you use a custom transparency render order or alpha hash, you won't get anywhere near "hundreds/thousands" of objects, especially if the camera zooms close onto the transparent materials, in which case performance will get obliterated by overdraw.

The right approach to creating interesting textures in this case would be to write a custom shader that renders one layer on top of another. Not only would that give you slightly simpler code, you'd also be able to control when to stop drawing further layers, skipping unnecessary calculations.
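A minimal sketch of that idea as a GLSL fragment shader — all the names here (uLayerCount, uLayerOpacity, uRadius, layerCenter) are made up for illustration, not part of any library:

// One fragment shader accumulates every "circle" layer itself, in float
// precision, so no framebuffer blending (and no 8-bit rounding) is involved.
uniform int uLayerCount;      // hypothetical: number of stacked circles
uniform float uLayerOpacity;  // hypothetical: opacity per circle
uniform float uRadius;        // hypothetical: circle radius in UV space
varying vec2 vUv;

// hypothetical per-layer placement; replace with your own pattern
vec2 layerCenter(int i) {
    return fract(vec2(float(i) * 0.754877, float(i) * 0.569840));
}

void main() {
    float dst = 1.0;                      // start at white
    for (int i = 0; i < 4096; i++) {      // fixed upper bound for the loop
        if (i >= uLayerCount) break;
        if (distance(vUv, layerCenter(i)) < uRadius) {
            dst *= 1.0 - uLayerOpacity;   // blend one black circle, in float
        }
        if (dst < 1.0 / 512.0) break;     // stop once effectively black
    }
    gl_FragColor = vec4(vec3(dst), 1.0);
}

The early `break` is where you save the work: once the accumulated value is effectively black, further layers can't change the pixel.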


Oh yep haha that is very true. Although I can confirm that the JS Fiddle example has the same behavior on the latest version, v0.180.0

Performance isn’t my concern here. It’s only meant to run on one dedicated machine for an installation. I’m more interested in whether it can work at all at really low opacities.

And sorry, creating textures was just one example. I am more interested in the limits of transparency in three.js in general.

Do you have any advice about the specific problem I am having?

If using black objects on a white background, the objects collectively won’t approach black. They will stay at a gray no matter how many are stacked.

The frame buffer used for the canvas stores 8 bits per channel (256 levels), which is not enough resolution to represent the subtle changes from low-opacity blending. Likely what’s happening is that each low-opacity object is blended, the result is rounded to the nearest of the 256 representable levels, and at some point that rounding lands on the same value for every subsequent blend because the per-blend change is so small.
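You can see the rounding stall in plain JavaScript — this is just a simulation of the blend equation, not actual renderer code:

```javascript
// Blend 1000 black layers of alpha 0.002 over white (value 1.0),
// once at float precision and once quantized to 8 bits after every
// blend, the way the default 8-bit framebuffer effectively behaves.
const a = 0.002;                                  // per-layer opacity
const quantize = v => Math.round(v * 255) / 255;  // snap to one of 256 levels

let exact = 1.0;   // float-precision destination
let stored = 1.0;  // 8-bit destination
for (let i = 0; i < 1000; i++) {
  exact = exact * (1 - a);             // dst' = dst * (1 - srcAlpha), src is black
  stored = quantize(stored * (1 - a)); // same blend, but rounded each time
}
console.log(exact.toFixed(3));   // 0.135 — keeps approaching black
console.log(stored.toFixed(3));  // 0.980 — stalls at a light gray
```

The quantized version gets stuck as soon as the per-blend change rounds back to the level it started from, which matches the "stays at a gray no matter how many are stacked" behavior.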

Using an intermediate 16- or 32-bit-per-channel float render target for rendering should help improve the blending result.
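A sketch of that — assuming a standard three.js setup where `renderer`, `scene`, and `camera` already exist; the variable names here are made up:

import * as THREE from 'three';

// Render into a 16-bit-per-channel float target so blending keeps
// far more precision than the 8-bit canvas framebuffer.
const target = new THREE.WebGLRenderTarget(
  window.innerWidth, window.innerHeight,
  { type: THREE.HalfFloatType }
);

// Full-screen quad used to copy the target back to the canvas.
const copyScene = new THREE.Scene();
const copyCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
copyScene.add(new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: target.texture })
));

function render() {
  renderer.setRenderTarget(target);  // all transparent blending happens here
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);    // then blit the result to the screen
  renderer.render(copyScene, copyCamera);
}

I believe recent three.js releases also default EffectComposer to a half-float buffer, so simply rendering through a composer may be enough on its own.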

when set to a value lower than around 0.02

To clarify: the value in your demo is set to 0.002, and 1 / 256 ≈ 0.0039, so the opacity you’re using is only about half the smallest representable color step.


You can use stochastic rounding to get around the resolution issue if you’re willing to accept a bit of noise.

something like this:

// quantize opacity_01 to 1/256 steps, rounding up with probability
// equal to the dropped fraction, so the rounding is unbiased on average
let opacity_resolution = 256.0;
let whole_opacity = opacity_01 * opacity_resolution;
let fraction = fract(whole_opacity);

var rounded = floor(whole_opacity);
if (random() < fraction) {
    rounded += 1.0;
}

let new_opacity_01 = rounded / opacity_resolution;

It’s a fairly old trick.

For random you can use whatever you like. There is a plethora of hash functions out there. Here’s a basic one to get you started:

fn random() -> f32 {
    return abs( sin(coord.x + coord.y * 0.123 + coord.z * 0.79) );
}

where coord is the fragment’s position.
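A quick plain-JavaScript check (a simulation, not shader code) that this kind of unbiased rounding avoids the stall:

```javascript
// Quantize to 8 bits after every blend, but round up with probability
// equal to the dropped fraction (stochastic rounding), so the
// expected value of each quantized blend equals the exact value.
const stochasticQuantize = v => {
  const scaled = v * 255;
  const low = Math.floor(scaled);
  const frac = scaled - low;
  return (Math.random() < frac ? low + 1 : low) / 255;
};

const a = 0.002;  // per-layer opacity
let dst = 1.0;    // white background
for (let i = 0; i < 1000; i++) {
  dst = stochasticQuantize(dst * (1 - a));  // blend one black layer
}
// dst now lands near the exact 0.998^1000 ≈ 0.135 (plus some noise),
// instead of stalling at a light gray the way plain rounding does.
console.log(dst);
```

Each individual pixel is noisy, but averaged over neighbors (or frames) the result tracks the exact blend — that's the trade the stochastic approach makes.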
