To answer your question though - in the default three.js setup, you're "allowed" to overlay only two transparent objects. Even if you decide to use a custom transparency render order or alpha hashing, you won't get anywhere near "hundreds / thousands" of objects, especially if the camera zooms in close on the transparent materials, in which case performance gets obliterated by overdraw.
The right approach to creating interesting textures in this case would be to write a custom shader that renders one layer on top of another. Not only does that give you slightly simpler code, you'll also be able to control when to stop drawing further layers, skipping unnecessary calculations.
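Something along these lines, as a rough sketch (the uniform names and the layer/color setup are hypothetical, just to illustrate blending in a loop and bailing out early):

```js
import * as THREE from 'three';

// One quad with a custom shader replaces a whole stack of transparent meshes.
const layeredMaterial = new THREE.ShaderMaterial({
  uniforms: {
    uLayerColor: { value: new THREE.Color(0x000000) }, // color of each translucent layer
    uLayerOpacity: { value: 0.002 },                   // per-layer opacity
    uLayerCount: { value: 500 },                       // how many layers to stack
  },
  fragmentShader: /* glsl */ `
    uniform vec3 uLayerColor;
    uniform float uLayerOpacity;
    uniform int uLayerCount;

    void main() {
      vec3 color = vec3(1.0); // white "background"
      for (int i = 0; i < 10000; i++) {
        if (i >= uLayerCount) break;
        // One blend step, done in full float precision instead of 8 bits.
        color = mix(color, uLayerColor, uLayerOpacity);
        // Stop once further layers can't visibly change the result.
        if (distance(color, uLayerColor) < 1.0 / 512.0) break;
      }
      gl_FragColor = vec4(color, 1.0);
    }
  `,
});

// e.g. scene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), layeredMaterial));
```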
Oh yep haha that is very true. Although I can confirm that the JS Fiddle example has the same behavior on the latest version, v0.180.0.
Performance isn't my concern here. It's only meant to run on one dedicated machine for an installation. I'm more interested in whether it can work at all at really low opacities.
And sorry, creating textures was just one example. I am more interested in the limits of transparency in three.js in general.
Do you have any advice about the specific problem I am having?
If using black objects on a white background, the objects collectively won’t approach black. They will stay at a gray no matter how many are stacked.
The frame buffer used for the canvas stores only 8 bits per color channel (256 possible values), which is not enough resolution to represent subtle changes from low-opacity blending. Likely what's happening is that the low-opacity objects are blended, the result is rounded to the nearest of those 256 values, and at some point that rounding snaps back to the same value on each subsequent blend because the change per blend is so small.
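You can reproduce this outside of three.js with a few lines of plain JS (the 0.002 opacity and the iteration count are just illustrative numbers):

```js
// Blend 0.002-opacity black over a white background 1000 times:
// once in full float precision, and once snapping to the nearest of the
// 256 representable channel values after every blend, like the canvas does.
let exact = 1.0;
let quantized = 1.0;

for (let i = 0; i < 1000; i++) {
  exact *= 1 - 0.002;                              // keeps darkening
  quantized *= 1 - 0.002;
  quantized = Math.round(quantized * 255) / 255;   // snap to the 8-bit grid
}

console.log(exact.toFixed(3));     // ~0.135 - clearly darkened
console.log(quantized.toFixed(3)); // ~0.98 - stalls at a light gray after a few blends
```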
Using an intermediate 16- or 32-bit render target for rendering should help improve the blending result.
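A minimal sketch of that, assuming your existing renderer / scene / camera (I believe recent versions of EffectComposer already default to a half-float target, but passing one explicitly makes the intent clear):

```js
import * as THREE from 'three';
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

// 16 bits per channel instead of the canvas's 8, so tiny per-blend
// changes are no longer rounded away.
const size = renderer.getSize(new THREE.Vector2());
const target = new THREE.WebGLRenderTarget(size.x, size.y, {
  type: THREE.HalfFloatType,
});

const composer = new EffectComposer(renderer, target);
composer.addPass(new RenderPass(scene, camera)); // all blending happens in half float
composer.addPass(new OutputPass());              // converts the result to the 8-bit canvas

// Then call composer.render() in the animation loop instead of renderer.render(scene, camera).
```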
when set to a value lower than around 0.02
To clarify - the value in your demo is set to 0.002, and 1 / 256 is about 0.0039, so the opacity you're using is just over half of the smallest representable color step.