Texture RGB values are zero when alpha is zero on iOS

I am working on a WebGL project using JavaScript and the three.js framework. For it I am writing a custom shader in GLSL that has to load several lookup tables, meaning I need to read some textures’ individual RGBA values for calculations rather than displaying them.

This works fine on all devices that I’ve tested. However, on iOS devices (like an iPad) the RGB values of a texture are automatically set to 0 whenever its alpha channel is 0. I do not think that this is due to GLSL’s texture2D function; rather, it seems to have something to do with how three.js loads textures on iOS. I am using the built-in TextureLoader for that:

var textureLoader = new THREE.TextureLoader();
var lutMap = textureLoader.load('path/to/lookup/table/img.png');

// Disable filtering and mipmaps so the shader reads back exact texel values.
lutMap.minFilter = THREE.NearestFilter;
lutMap.magFilter = THREE.NearestFilter;
lutMap.generateMipmaps = false;
lutMap.type = THREE.UnsignedByteType;
lutMap.format = THREE.RGBAFormat;
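
Note that TextureLoader.load returns immediately and fills in the image asynchronously, so the pixel data is only available once the onLoad callback fires (a minimal sketch):

// The second argument to load() is the onLoad callback.
var lutMap = textureLoader.load( 'path/to/lookup/table/img.png', function ( texture ) {
    console.log( 'LUT ready:', texture.image.width, 'x', texture.image.height );
} );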

For testing purposes I’ve created a test image with constant RGB values (255, 0, 0) and an alpha value that decreases steadily from the top-right corner to the bottom-left one, with some pixels’ alpha values being 0:


After the texture was loaded, I checked the zero-alpha pixels and their R values were indeed set to 0. Likewise, on the iOS device the texture2D(…) lookup in the GLSL code returned (0, 0, 0, 0) for exactly those pixels. In the JavaScript code I used the following function to read the image’s data before sending it to the shader:

function getImageData( image ) {

    // Draw the image to an offscreen 2D canvas and read back its RGBA bytes.
    var canvas = document.createElement( 'canvas' );
    canvas.width = image.width;
    canvas.height = image.height;

    var context = canvas.getContext( '2d' );
    context.drawImage( image, 0, 0 );

    return context.getImageData( 0, 0, image.width, image.height );

}
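
To inspect individual pixels I then index into the returned data array, roughly like this (a sketch; getPixel is just an illustrative helper, the RGBA bytes are laid out row by row):

// Read one pixel from an ImageData object (4 bytes per pixel: R, G, B, A).
function getPixel( imageData, x, y ) {
    var i = ( y * imageData.width + x ) * 4;
    var d = imageData.data;
    return { r: d[ i ], g: d[ i + 1 ], b: d[ i + 2 ], a: d[ i + 3 ] };
}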

The strange thing was that this was also true in the JavaScript code on my Windows PC, yet there the shader works just fine. So maybe this is only due to the canvas and has nothing to do with the actual problem. (Please note that I come from Java/C++ and I am not very familiar with JavaScript yet! 🙂)
I’ve also tried setting the premultipliedAlpha flag to false, both in the WebGLRenderer instance and in the THREE.ShaderMaterial object itself. Sadly, it did not fix the problem.
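
For reference, this is roughly how I set those flags (a sketch; I left my shader sources out here, so three.js falls back to its built-in defaults):

// Disable premultiplied alpha on the renderer (a WebGL context attribute) ...
var renderer = new THREE.WebGLRenderer( { premultipliedAlpha: false } );

// ... and on the material itself.
var material = new THREE.ShaderMaterial( {
    uniforms: { lutMap: { value: lutMap } },
    premultipliedAlpha: false
} );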

Has anyone experienced similar problems and found a way to fix this unwanted behaviour?

Can you please share the image in this thread?


Sure, I’ve updated the main post 🙂

Can you please test how the following fiddle is rendered on your device?

https://jsfiddle.net/vyzc2sgj/1/

As you can see, the custom shader just uses the red channel of the texel as the fragment color. Normally, the entire plane should be red. How does it look on iOS?
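
The relevant part of the fiddle is roughly this (a sketch, not the verbatim fiddle code):

// Output only the texel's red channel as an opaque fragment color.
var material = new THREE.ShaderMaterial( {
    uniforms: { map: { value: texture } }, // texture: the test image loaded earlier in the fiddle
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'uniform sampler2D map;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 texel = texture2D( map, vUv );',
        '    gl_FragColor = vec4( texel.r, 0.0, 0.0, 1.0 );',
        '}'
    ].join( '\n' )
} );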

As expected, the bottom-left corner is black, since the alpha values are zero there.
[screenshot from the iOS device]

Um, strange. TBH, this seems like a browser issue to me. three.js does not do anything fancy with the texture; everything here is basic stuff. Consider reporting this as a bug to WebKit:

https://webkit.org/reporting-bugs/

Yes, that could be. But it seems like an iOS-specific problem: the error occurs on both iPads and iPhones, using both Chrome and Safari.

AFAIK, Chrome uses WebKit technologies on iOS, too.

I found an already existing WebKit bug report concerning this exact problem:

Bugreport

I think I faced this issue before. In my case it was alpha pre-multiplication.

I ended up using a DataTexture, I think: draw the PNG to a canvas, then rip out the raw pixel data and push it into a DataTexture.
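
Something like this (a sketch; it reuses the getImageData() helper posted above):

// Workaround sketch: copy the canvas pixels into a DataTexture.
var image = new Image();
image.onload = function () {

    var imageData = getImageData( image );

    var texture = new THREE.DataTexture(
        new Uint8Array( imageData.data ), // copy the clamped canvas bytes into a plain Uint8Array
        imageData.width,
        imageData.height,
        THREE.RGBAFormat,
        THREE.UnsignedByteType
    );
    texture.minFilter = THREE.NearestFilter;
    texture.magFilter = THREE.NearestFilter;
    texture.generateMipmaps = false;
    texture.needsUpdate = true;
    // Note: unlike image textures, a DataTexture is not flipped vertically
    // on upload, so the lookup UVs may need adjusting.

};
image.src = 'path/to/lookup/table/img.png';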

Apparently the issue occurs at a pretty low level in the iOS browser stack, so the loaded PNG data would already be faulty before I create the DataTexture.
The CoreGraphics module automatically premultiplies the alpha channel into the RGB channels, so the true RGB values can’t be restored anymore (special thanks to MoDJ).
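
To spell out why: premultiplication stores round(R * A / 255) per channel, so a pixel with A = 0 stores 0 no matter what R was, and un-premultiplying afterwards would be a division by zero (a minimal sketch):

// Premultiplied storage keeps round( r * a / 255 ) per color channel.
function premultiply( r, a ) {
    return Math.round( r * a / 255 );
}

console.log( premultiply( 255, 128 ) ); // 128 -> dividing by alpha recovers ~255
console.log( premultiply( 255, 0 ) );   // 0   -> 0 / 0; the original red value is gone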

Well, you have other options. You can decode the PNG yourself; there are a handful of NPM modules that do that.
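
For example (a sketch assuming the upng-js NPM package, whose UPNG.decode and UPNG.toRGBA8 functions do the decoding; decoding in JavaScript bypasses CoreGraphics entirely):

import * as THREE from 'three';
import * as UPNG from 'upng-js';

// Decode the PNG in JS so CoreGraphics never touches (and premultiplies) it.
async function loadLutTexture( url ) {

    const buffer = await ( await fetch( url ) ).arrayBuffer();
    const png = UPNG.decode( buffer );
    const rgba = new Uint8Array( UPNG.toRGBA8( png )[ 0 ] ); // first frame, straight (non-premultiplied) alpha

    const texture = new THREE.DataTexture( rgba, png.width, png.height, THREE.RGBAFormat, THREE.UnsignedByteType );
    texture.minFilter = THREE.NearestFilter;
    texture.magFilter = THREE.NearestFilter;
    texture.generateMipmaps = false;
    texture.needsUpdate = true;

    return texture;

}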
