I am working on a WebGL project using JavaScript and the three.js framework. For it I am writing a custom GLSL shader in which I have to load several lookup tables, meaning I need to read some textures’ individual RGBA values for calculations rather than displaying them.
This works fine on every device I’ve tested except iOS devices (such as an iPad), where the RGB values of a texture are automatically set to 0 whenever its alpha channel is 0. I don’t think this is caused by GLSL’s texture2D function; it seems to be related to how three.js loads textures on iOS. I am using the built-in TextureLoader for that:
var textureLoader = new THREE.TextureLoader();
var lutMap = textureLoader.load('path/to/lookup/table/img.png');
lutMap.minFilter = THREE.NearestFilter;
lutMap.magFilter = THREE.NearestFilter;
lutMap.generateMipmaps = false;
lutMap.type = THREE.UnsignedByteType;
lutMap.format = THREE.RGBAFormat;
For testing purposes I’ve created a test image with constant RGB values (255, 0, 0) and an alpha value that decreases steadily from the top-right corner to the bottom-left one, so that some pixels’ alpha values are 0:
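For clarity, the pattern of that test image can be sketched as a raw RGBA array (the 4×4 size and the linear alpha ramp are just illustrative assumptions, not the real image dimensions):

```javascript
// Hypothetical sketch of the test pattern: constant red, alpha ramping
// from 255 at the top-right corner down to 0 at the bottom-left corner.
var size = 4; // assumed size for illustration
var data = new Uint8Array(size * size * 4);
for (var y = 0; y < size; y++) {
    for (var x = 0; x < size; x++) {
        var i = (y * size + x) * 4;
        data[i]     = 255; // R
        data[i + 1] = 0;   // G
        data[i + 2] = 0;   // B
        // alpha decreases toward the bottom-left and reaches 0 there
        data[i + 3] = Math.round(255 * (x + (size - 1 - y)) / (2 * (size - 1)));
    }
}
```

With this pattern, the bottom-left pixel has alpha 0 while its R value is still 255, which is exactly the combination that gets zeroed on iOS.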
After the texture was loaded, I checked the zero-alpha pixels and their R values were indeed set to 0. On the iOS device, the texture2D(…) lookup in the GLSL code also returned (0, 0, 0, 0) for exactly those pixels. In the JavaScript code I used the following function to read the image’s data before sending it to the shader:
function getImageData( image ) {
    var canvas = document.createElement( 'canvas' );
    canvas.width = image.width;
    canvas.height = image.height;

    var context = canvas.getContext( '2d' );
    context.drawImage( image, 0, 0 );

    return context.getImageData( 0, 0, image.width, image.height );
}
The strange thing is that the zero values also showed up in this JavaScript check on my Windows PC, where the shader nevertheless works just fine. So maybe that part is only due to the canvas and has nothing to do with the actual problem. (Please note that I come from Java/C++ and I am not very familiar with JavaScript yet!)
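My understanding is that a 2D canvas stores pixels with premultiplied alpha, which would at least explain the JavaScript reading: the premultiply/unpremultiply round trip is lossy, and at alpha = 0 the original RGB is destroyed. A minimal sketch of that arithmetic (plain JS, hypothetical helper names, not canvas API calls):

```javascript
// Premultiplied-alpha round trip, similar to what a 2D canvas does
// internally. For alpha = 0 the original RGB cannot be recovered.
function premultiply(r, g, b, a) {
    var f = a / 255;
    return [Math.round(r * f), Math.round(g * f), Math.round(b * f), a];
}

function unpremultiply(r, g, b, a) {
    if (a === 0) return [0, 0, 0, 0]; // RGB information is gone
    var f = 255 / a;
    return [Math.round(r * f), Math.round(g * f), Math.round(b * f), a];
}

console.log(unpremultiply.apply(null, premultiply(255, 0, 0, 0)));   // [0, 0, 0, 0]
console.log(unpremultiply.apply(null, premultiply(255, 0, 0, 128))); // [255, 0, 0, 128]
```

So the canvas check may simply be unable to observe the original RGB values at alpha = 0, whether or not the GPU texture itself was altered.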
I’ve also tried setting the premultipliedAlpha flag to false both in the WebGLRenderer instance and in the THREE.ShaderMaterial object itself. Sadly, it did not fix the problem.
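For reference, these are the two places where I set the flag (a configuration sketch; the shader source variables are placeholders for my actual shaders):

```javascript
// Attempt 1: disable premultiplied alpha on the rendering context.
var renderer = new THREE.WebGLRenderer({ premultipliedAlpha: false });

// Attempt 2: disable it on the material that samples the lookup table.
var lutMaterial = new THREE.ShaderMaterial({
    uniforms: { lutMap: { value: lutMap } },
    vertexShader: vertexShaderSource,     // placeholder
    fragmentShader: fragmentShaderSource, // placeholder
    premultipliedAlpha: false
});
```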
Has anyone experienced similar problems and knows how to fix this unwanted behaviour?