I have an AR.js scene (three.js + A-Frame + AR.js) built around a simple plane with a custom shader that's meant to give the feeling of flying over a glacier.
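For context, here's a stripped-down sketch of how that plane's shader is registered. The name `glacier`, the `time` uniform, and the coloring are placeholders, not my exact code:

```js
// Stripped-down sketch of the glacier plane's shader (placeholder code).
AFRAME.registerShader('glacier', {
  schema: {
    time: { type: 'time', is: 'uniform' }
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform float time;
    varying vec2 vUv;
    void main() {
      // Scroll the UVs over time to fake flying forward over the ice.
      vec2 uv = vUv + vec2(0.0, time * 0.0002);
      // Placeholder banding; the real shader does much more.
      vec3 ice = mix(vec3(0.55, 0.75, 0.9), vec3(1.0), fract(uv.y * 8.0));
      gl_FragColor = vec4(ice, 1.0);
    }
  `
});
// Used roughly like: <a-plane material="shader: glacier" rotation="-90 0 0"></a-plane>
```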
I've just added a second plane above the first and given it a very simple custom fragment shader, my first step toward a layer of clouds. Anywhere the cloud layer's gl_FragColor is partially transparent, where I'd expect the glacier's colors to show through, nothing comes through at all. I assumed the scene would build up the objects after computing their shaders, but the topmost one makes everything below it irrelevant.
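Here's the shape of the new cloud shader, again simplified; the point is just that the alpha is deliberately well below 1.0:

```js
// The cloud plane's shader, reduced to the essentials.
AFRAME.registerShader('clouds', {
  schema: {},
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    varying vec2 vUv;
    void main() {
      // A soft white blob whose alpha tops out at 0.6; I expected the
      // glacier underneath to show through here, but I see nothing.
      float cloud = 1.0 - smoothstep(0.2, 0.6, distance(vUv, vec2(0.5)));
      gl_FragColor = vec4(vec3(1.0), cloud * 0.6);
    }
  `
});
```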
For every custom shader in the scene, do I have to somehow calculate and pass in what would be visible beneath it? That seems crazy. Is it the order in which the objects are rendered? But that would change based on what's closest to the camera.
Or is there a property of the object, or a standard chunk of shader code I need to include… that everyone knows about but me?
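In case the material state matters for diagnosing this, a throwaway component like the one below (the name `log-material` is mine, just for illustration) could dump the three.js flags that seem relevant from either plane:

```js
// Debug helper: log the underlying three.js material flags once the
// entity's mesh exists. Attach via: <a-plane log-material ...>
AFRAME.registerComponent('log-material', {
  init: function () {
    this.el.addEventListener('object3dset', (evt) => {
      if (evt.detail.type !== 'mesh') { return; }
      const mesh = this.el.getObject3D('mesh');
      console.log(this.el.id,
                  'transparent:', mesh.material.transparent,
                  'depthWrite:', mesh.material.depthWrite,
                  'renderOrder:', mesh.renderOrder);
    });
  }
});
```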
I realize there's fundamental stuff I'm missing here that I can't find in the docs. Looking forward to understanding this awesome world just a tiny bit more.
Also, the tear in that photo is not AR. That's a real tear.
Thank you!