Glad you got some use out of it! I’ve had some luck with screen-door transparency as well. Higher resolutions make the pattern much less noticeable nowadays, and using blue noise for the dither pattern might make it even less noticeable.
Why not render it in a hidden DOM with HTML/CSS, capture the view bitmap and project it on a plane in 3D as a texture
I don’t believe this is actually possible in browsers without heavy manual workarounds.
The planes issue laid out in this thread is actually a relatively simple case – you just have to manually sort and render them in the order you want, which renderOrder can be used for.
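As a sketch of that manual sorting (plain objects with a `position` and a `renderOrder` field stand in for meshes here; `assignRenderOrder` is a hypothetical helper, not a three.js API):

```javascript
// Sort transparent objects back-to-front relative to the camera and encode
// that order in renderOrder (a higher renderOrder is drawn later, i.e. on top).
function assignRenderOrder(objects, cameraPosition) {
  const dist = (o) => {
    const dx = o.position.x - cameraPosition.x;
    const dy = o.position.y - cameraPosition.y;
    const dz = o.position.z - cameraPosition.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
  };
  // Farthest object first: it should be drawn earliest, so it gets the
  // lowest renderOrder.
  [...objects]
    .sort((a, b) => dist(b) - dist(a))
    .forEach((obj, index) => { obj.renderOrder = index; });
}
```

With real three.js meshes you would call something like this each frame before rendering, and combine it with `depthWrite: false` on the transparent materials.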
The moment we do that, and add more objects, the complexity becomes unmanageable. I mean, if there was a way to manage it, then I think WebGLRenderer would be doing exactly that. Yeah, the two planes is a simple example, but it’s a simple example taken from a more complicated app with a lot more objects everywhere.
Beyond Full HD, or at Full HD on smartphone-sized screens, that pattern gets much less visible, yes. Unfortunately, on the most common screens (still Full HD) it’s very noticeable. Smoothing it out with a proper technique seems almost like a holy grail of order-independent transparency for WebGL: no reliance on multisampling, and the least cost, even if the quality is a bit lower. For larger surfaces it would certainly blur anything behind it a bit, but that seems more like a feature.
The most common transparency issue people ask for help with, over and over again, is partially transparent objects like plants, and self-intersections. Sorting works well enough for objects like windows, but transparency on vegetation, hair, and such is a different case. I think screen-door transparency is the best candidate there, at least for WebGL right now.
The moment we do that, and add more objects, the complexity becomes unmanageable… Yeah, the two planes is a simple example, but it’s a simple example taken from a more complicated app with a lot more objects everywhere.
There’s not nearly enough information given to extrapolate out to whatever scenario you’re dealing with. I’d be happy to give more advice but your problem statement is too broad for a domain that relies on handling special cases. This post and your example show transparent parallel planes (or thin boxes) on top of each other without intersecting and a solution has been given with render order to handle that case which can be set dynamically, as well. You’re also provided the ability to override the sort function that three.js uses for transparent objects which may be useful. Regarding complexity of course you have more data and objects to manage but you can write code that handles sorting in a way that suits your app.
I mean, if there was a way to manage it, then I think WebGLRenderer would be doing exactly that.
My whole other post was about how there’s no single general solution for transparency. Three.js has picked a sorting solution that is fast and works “well enough” in the general case (I believe sorting based on distance to the world space origin of the mesh). If you have some notion of the “right” order in which to draw geometry that’s different you’re going to have to program that. But the tools should be there.
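For example, recent three.js versions expose `renderer.setTransparentSort()` for replacing the comparator. Below is a sketch of what such a comparator could look like, using plain objects shaped roughly like the render-list items three.js passes in (an explicit `renderOrder` plus a camera-space depth `z`); the default behaves approximately like this:

```javascript
// Comparator for the transparent render list: explicit renderOrder wins,
// otherwise draw back-to-front by camera-space depth.
function transparentSort(a, b) {
  if (a.renderOrder !== b.renderOrder) {
    return a.renderOrder - b.renderOrder; // app-defined order takes priority
  }
  return b.z - a.z; // larger z = farther away = drawn first
}
```

An app that knows the "right" order (say, layered parallel planes) can set `renderOrder` on those meshes and let a comparator like this handle everything else by depth.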
It would be sweet if the default transparency mode in Three.js were one that prioritized accurate visuals (so the issue in the two-parallel-planes example wouldn’t happen), and if there were an option to tell Three which mode to use (perhaps an option passed to WebGLRenderer?).
I don’t think there is any method that consistently gives better visuals without a really unacceptable hit to performance. I’d like to see three.js include an alpha hash or “dither” transparency mode as an alternative.
That can be more predictable than alpha blending, and easier to set up, but the screen door pattern means it’s only visually “better” in certain situations and not others.
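To illustrate the core idea of alpha hashing in plain JS rather than a shader (the `hash2D` function is a hypothetical stand-in for whatever hash a real implementation would use): derive a stable pseudo-random threshold from the fragment’s position and alpha-test against it, so a surface with alpha 0.3 keeps roughly 30% of its fragments with no blending or sorting at all.

```javascript
// Cheap sine-based hash, similar in spirit to common GLSL one-liners.
function hash2D(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s); // fract() -> value in [0, 1)
}

// Keep the fragment only if its alpha exceeds the positional threshold.
function alphaHashPasses(alpha, fragX, fragY) {
  return alpha > hash2D(fragX, fragY);
}
```

Because the threshold is a function of position, the result is stable from frame to frame (no shimmering), but the discarded pixels are what produces the grainy “screen door” look.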
Also note that if you’re setting .transparent = true on a material, you should generally be setting .depthWrite = false as well. This is often overlooked.
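A typical material setup for blended transparency might look like this (a configuration sketch, assuming a standard three.js material; `depthWrite` defaults to `true`, which is the commonly overlooked part):

```javascript
// Blended transparency: let the object be seen through, but don't let it
// write depth and occlude other transparent objects drawn after it.
const material = new THREE.MeshStandardMaterial({
  transparent: true,
  opacity: 0.5,
  depthWrite: false,
});
```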
Filtering the dithered areas again can give very good results, but it requires post processing.
Like I said, I wouldn’t use it as a general solution – windows, for example, work fine with sorting – but as a solution for partially transparent and self-intersecting objects. You won’t get any working transparency out of the box in the latter case except alpha test, and screen-door transparency is just alpha test with a dither pattern derived from the alpha value.
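That equivalence is easy to show outside a shader. The 4×4 matrix below is the classic ordered-dither (Bayer) pattern; in a real implementation the comparison happens per fragment in GLSL, but the logic is the same:

```javascript
// Classic 4x4 Bayer matrix for ordered dithering.
const BAYER_4X4 = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
];

// Screen-door transparency: the matrix supplies a repeating per-pixel
// threshold, and the alpha test keeps a fragment only when its alpha
// exceeds the threshold at its screen position.
function screenDoorPasses(alpha, pixelX, pixelY) {
  // Normalize matrix entries to thresholds strictly inside (0, 1).
  const threshold = (BAYER_4X4[pixelY % 4][pixelX % 4] + 0.5) / 16;
  return alpha > threshold;
}
```

With alpha 0.5, exactly half the pixels in each 4×4 tile survive, which is what gives the even “screen door” appearance rather than random noise.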
Does the location of the object center play any part in the 3JS decision to draw an object? Or is the decision based on the location of individual faces from all objects?
If the former, could you influence the drawing decision by moving the object center toward the viewer?
Location of the object center affects the order in which objects are drawn. If depthWrite is enabled on the material, then the order in which objects are drawn may cause objects to disappear. It’s usually better to avoid that relationship and disable depthWrite on transparent materials, per https://github.com/mrdoob/three.js/pull/18235.
The order of individual faces within an object also matters, if the object occludes itself, but it’s usually not practical to sort individual faces, and three.js does not attempt to do so.
I have been working on a simulation where island objects are placed just above the water. However, as I gain altitude, the islands start flashing because the program has trouble deciding what to draw first. I solved that by raising the islands in the air by a certain % of my altitude. (There are no shadows, so you don’t notice.)
However, this discussion made me wonder if I could achieve the same result by raising the center of the island to, say, 100 feet above sea level.
Transparency may share a similar issue. So I wonder if you could do something similar by turning the transparent texture into a separate object. You could offset the object center towards the viewer manually, or have the program do so. That may not be practical for general use, but might be helpful in certain limited situations where maintaining the proper transparency is important.
Ah, that sounds like an issue of limited precision in the depth buffer — it certainly could affect either transparent or opaque objects, yes, and what you describe sounds like a good trick to keep the depth buffer functional for distant objects. But, I don’t think transparent objects are any more prone to that particular issue than opaque objects.
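Two standard mitigations for that precision problem in three.js (a configuration sketch; the near/far values are hypothetical and depend on the scene’s scale):

```javascript
// Option 1: opt into a logarithmic depth buffer, which distributes precision
// more evenly over large view distances.
const renderer = new THREE.WebGLRenderer({ logarithmicDepthBuffer: true });

// Option 2: tighten the near/far range as much as the scene allows; a larger
// near plane helps far more than a smaller far plane does.
const camera = new THREE.PerspectiveCamera(60, 16 / 9, 1, 20000);
```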
I have this case a lot, and removing transparent = true and setting alphaTest = 0.5 can fix this issue. That works for me; I use models with the .dae extension.
for (let i in newInterface["dae"]["alpha"]) {
    // This is an array with the object names
    let path = eval(jsonPath[newInterface["dae"]["alpha"][i].name]);
    // This is an array with the alpha image objects
    let texturePath1 = newInterface["dae"]["alpha"][i]["texturePath"];
    let texture = new THREE.TextureLoader().load(texturePath1); // Here I load the texture
    path.material.alphaMap = texture; // Here I assign the texture as the alpha map
    path.material.alphaMap.magFilter = THREE.NearestFilter;
    path.material.alphaMap.wrapT = THREE.RepeatWrapping;
    path.material.alphaMap.repeat.y = 1;
    path.material.alphaTest = 0.5; // And here I set alphaTest to 0.5, which works for me
}
And this is how I pick the objects and load the textures.