Fragment shader normal artifacts appearing on specific GPU

I am calculating normals from an RGB-encoded height map:


vec3 unpackFactors = vec3(256.0 * 255.0, 255.0, 255.0 / 256.0);
float unpackOffset = -32768.0;
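For reference, the packing these factors assume can be verified outside the shader. Here is a small JavaScript sketch (the `encodeHeight` helper is hypothetical, not part of my code) that round-trips a height through the RGB encoding:

```javascript
// Hypothetical helper: pack a height (metres, 1/256 m steps, offset by
// -32768) into three bytes, matching the shader's
// dot(rgb, unpackFactors) + unpackOffset decode.
function encodeHeight(h) {
  const v = Math.round((h + 32768) * 256); // 24-bit fixed point
  return [(v >> 16) & 255, (v >> 8) & 255, v & 255];
}

// Decode the way the shader does: texture2D returns channels in [0, 1],
// so divide each byte by 255 before applying the unpack factors.
function decodeHeight([r, g, b]) {
  const factors = [256 * 255, 255, 255 / 256];
  return (r / 255) * factors[0] + (g / 255) * factors[1]
       + (b / 255) * factors[2] - 32768;
}

const roundTrip = decodeHeight(encodeHeight(1234.5)); // ~1234.5
```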

To decode it, I edited the built-in dHdxy_fwd() function of the Phong shader:

vec2 dHdxy_fwd() {
    float texelSize = 1.0 / 256.0;
    vec2 dSTdx = vec2(texelSize, .0);
    vec2 dSTdy = vec2(.0, texelSize);

    float Hll = bumpScale * dot(texture2D(displacementMap, vUv).rgb, unpackFactors) + unpackOffset;
    float dBx = bumpScale * dot(texture2D(displacementMap, vUv + dSTdx).rgb, unpackFactors) + unpackOffset - Hll;
    float dBy = bumpScale * dot(texture2D(displacementMap, vUv + dSTdy).rgb, unpackFactors) + unpackOffset - Hll;

    return vec2(dBx, dBy);
}

The decoding of the height unfortunately causes artifacts around the green channel of the texture – they appear on an iPhone 7, an iPhone XS and the dedicated Radeon 455 GPU of my Mac:

Those artifacts look like this:

Zoomed in:

On the Intel HD Graphics 530 (integrated GPU), however, there are no such artifacts visible:

Zoomed in:

This post follows up on “Compensate bump map artifacts from customized shader” – however, I have since found out that the code actually works perfectly fine on at least one GPU/hardware configuration.

Why are artifacts appearing on some GPUs? Any idea how to get rid of them? It seems like some numerical instability, but I have fumbled around with texture precision, compressing the total height value, etc., with no luck yet.

Yes, I think it’s a floating-point precision issue. You are probably hitting hardware limitations, which are usually only fixable by implementing different approaches.

Have you considered storing the height data in a floating point texture so you don’t have to perform the RGB decoding?
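If you go that route, one option is to perform the RGB decoding once on the CPU and upload the result as a float texture. A sketch (assuming you can get at the raw pixel bytes of the PNG, e.g. via a canvas getImageData call; `rgbaBytes` is assumed input):

```javascript
// Sketch: decode the RGB-packed PNG once on the CPU so the shader can
// sample plain float heights. "rgbaBytes" is RGBA data, 4 bytes per
// pixel, e.g. from canvas getImageData.
function rgbToHeights(rgbaBytes, width, height) {
  const out = new Float32Array(width * height);
  for (let i = 0; i < width * height; i++) {
    const r = rgbaBytes[4 * i];
    const g = rgbaBytes[4 * i + 1];
    const b = rgbaBytes[4 * i + 2];
    // Same packing the shader decodes: R*256 + G + B/256, offset -32768
    out[i] = r * 256 + g + b / 256 - 32768;
  }
  return out;
}

// Then, roughly (three.js):
// const tex = new THREE.DataTexture(heights, width, height,
//                                   THREE.RedFormat, THREE.FloatType);
// tex.needsUpdate = true;
```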


Setting texture.type = THREE.FloatType; fixes the artifacts completely, with nothing else to do. However, on smartphones (I can test on an iPhone XS), FloatType is not supported – only HalfFloatType.

I’m really trying to get to the bottom of this; what exactly do you mean by “don’t have to perform the RGB decoding”? It feels like I’m so close to a working result but I’m stuck. What would be a proper way to retrieve the overall height value from a HalfFloatType texture within the shader? It’s totally ok if I lose some precision (i.e., high-frequency bumps) in the process.

Also what’s weird is that if I use HalfFloatType, the artifacts improve on the MacBook (albeit still visible) – but on the iPhone it looks as bad as without using HalfFloatType at all.

EDIT: I also have to say that I receive the textures RGB encoded as .png files from a server.

I misunderstood your code, sorry. I thought you would encode a single float into an RGB value and then decode it back in the shader. This kind of RGB(A) packing is also used in three.js.

I have never seen a height color map. They are normally grayscale maps. What is the purpose of it?

The purpose is to cover a broad range of elevation levels with high enough precision. By distributing it over R, G and B channels, it’s possible to cover elevation from 0 to 2^16 meters with 1/256 m decimal precision.


I’m pretty sure I can narrow it down to this: magFilter = LinearFilter on the height map interpolates among the RGB channels in a way that cannot – without creating numerical instabilities – be recovered from.
By setting it to NearestFilter, the artifacts disappear, but with them also the smooth appearance of the normals.
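A quick numeric experiment supports this. The decode is a dot product, i.e. linear, so interpolating the RGB triplet itself would be harmless at full precision; the trouble starts if the sampler filters at limited precision, because any quantization error in the red channel is amplified by 256 * 255. A plain-JS sketch (simulating a hypothetical sampler that rounds each filtered channel back to 8 bits; the actual filtering precision of each GPU is an assumption):

```javascript
const factors = [256 * 255, 255, 255 / 256];
const decode = rgb =>
  rgb[0] * factors[0] + rgb[1] * factors[1] + rgb[2] * factors[2] - 32768;

// Two neighbouring texels, heights 0 m and 256 m:
// bytes (128,0,0) and (129,0,0), normalized to [0,1].
const a = [128 / 255, 0, 0];
const b = [129 / 255, 0, 0];

// Sample 25% of the way between them.
const t = 0.25;
const lerp = a.map((v, i) => v + t * (b[i] - v));
const exact = decode(lerp); // ~64 m, the correct interpolated height

// Simulate a sampler that rounds each filtered channel back to 8 bits:
const quantized = lerp.map(v => Math.round(v * 255) / 255);
const coarse = decode(quantized); // ~0 m, off by a full 64 m
```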

Do you see a reasonable way to smooth out the normals with this approach, without having a smoothed bump map beforehand? I tried this before (using the approach described in …), without much success.