I am implementing a water shader that uses a depth texture to render shorelines and water depth. However, I am having trouble with it on iOS devices, as it seems the depth texture I pass to the shader loses precision (I think?).
You can actually see an example of this issue in one of the THREE.js examples (I set it to DepthFormat and UnsignedIntType):
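For context, here is a minimal sketch of the kind of setup involved, along the lines of the three.js depth-texture example; the `renderer`, `scene`, and `camera` objects are assumed to exist already, and the exact names are illustrative:

```javascript
import * as THREE from 'three';

// Render target with an attached depth texture, configured with
// DepthFormat / UnsignedIntType as mentioned above.
const target = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
target.depthTexture = new THREE.DepthTexture();
target.depthTexture.format = THREE.DepthFormat;
target.depthTexture.type = THREE.UnsignedIntType;

// Render the scene into the target, then pass target.depthTexture
// to the water shader as a uniform.
renderer.setRenderTarget(target);
renderer.render(scene, camera);
renderer.setRenderTarget(null);
```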
On iOS devices (I tried an iPad and iPhone) you see an unsmooth depth representation (image on left), while on all other systems you get a smooth gradient (image on right).
Any ideas on why this issue occurs or have any solutions? Thanks!
What version of iOS are you using?
Can you please give UnsignedInt248Type a try?
I upgraded my iPad from iPadOS 14.8 to 15.1 this morning and now the example works!
Regarding UnsignedInt248Type, it gave the same results as described in the original post (“unsmooth depth representation”). Note that it also requires DepthStencilFormat, for anyone who wants to try it themselves.
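For anyone trying this combination, a sketch of the change (assuming a render target `target` with an attached `DepthTexture`, as in the three.js depth-texture example):

```javascript
// UnsignedInt248Type packs 24 bits of depth with an 8-bit stencil,
// so three.js requires the matching DepthStencilFormat on the texture.
target.depthTexture.format = THREE.DepthStencilFormat;
target.depthTexture.type = THREE.UnsignedInt248Type;
```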
I’m going to try this on an iPhone later this evening and I will update here for anyone who is curious.
Overall, it appears to be fixed via a software update for Apple devices.
However, I am still curious as to why this happened if anyone has an explanation. My only thought right now is that when passing the depth texture to the shader, something on iOS devices changes the precision of the depth data, but I don’t really know.
Thanks for the reply!
EDIT: Tested on an iPhone running iOS 15.1.1 and it works there as well now. So I guess this is no longer an issue, provided the end user has the most recent OS.
I’m reproducing this issue on iOS 15.4.1.
Tried the different texture types in the demo and there is no noticeable difference across them.
This “fixed” the depth buffer for me on iOS:

```glsl
precision highp sampler2D;
```
I’m using the depth to draw outlines, and without the statement above I would get a bunch of “stripes” along the camera’s Z axis, as the depth seemingly changed abruptly at specific intervals (instead of increasing linearly).
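To show where the statement goes, here is a minimal fragment-shader sketch that samples the depth texture; the uniform and varying names are illustrative, not from the original post:

```glsl
precision highp float;
// Without this, iOS appears to default depth samplers to lower precision,
// which quantizes the sampled depth into visible bands.
precision highp sampler2D;

uniform sampler2D tDepth;
varying vec2 vUv;

void main() {
  // Read the (non-linear) depth value from the depth texture.
  float depth = texture2D(tDepth, vUv).r;
  gl_FragColor = vec4(vec3(depth), 1.0);
}
```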