Specular aliasing with high-frequency normal map details

I’m running into what seems like specular aliasing, and I’m wondering whether this is expected and whether there’s any obvious way to mitigate it.

I have a shiny material with a normal map, which has some high-intensity, high-frequency details that represent “microbevels”.

This looks fine when there are enough pixels, but when viewed from afar (i.e. at low sampling density), I’m getting a lot more sparkly aliasing than I expect.

The aliasing can be mitigated effectively by increasing roughness, or by overriding the texture2D call to use a mipmap bias, but either approach has the unwanted side effect of reducing detail in other, non-problematic areas of the texture.
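
For reference, the bias override can be hacked in with onBeforeCompile. This is a minimal sketch, assuming the normal_fragment_maps chunk in your revision contains this exact string (recent releases do); the +1.5 bias is arbitrary:

```js
material.onBeforeCompile = ( shader ) => {
  shader.fragmentShader = shader.fragmentShader.replace(
    'texture2D( normalMap, vNormalMapUv )',
    // The third argument is an LOD bias: sample a blurrier mip of the normal map.
    'texture2D( normalMap, vNormalMapUv, 1.5 )'
  );
};
```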

Ideally, the “roughness” would automatically increase just for the edges, where there is a high rate of change in the normal, and only when the pixel density requires it. I could maybe craft a roughnessMap and modulate roughness based on these conditions.

But that got me wondering whether ThreeJS already does this calculation. Does ThreeJS take the normal delta into account when deciding how incoming light is calculated? Or is each pixel modelled as a “facet”?

Would creating a simple LOD solve the issue? Just assigning a material without the glittery detail whenever the object is further away, something like this sketch (geometry, materials, and distances are placeholders):
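
```js
const lod = new THREE.LOD();
lod.addLevel( new THREE.Mesh( geometry, microbevelMaterial ), 0 );  // near: full “microbevel” normal map
lod.addLevel( new THREE.Mesh( geometry, smoothMaterial ), 20 );     // far: detail removed
scene.add( lod );
```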

> Would creating a simple LOD solve the issue?

That’s a helpful tip, but in this case, no. Distance from camera isn’t the only thing that alters sampling density; other factors, like rotation, also cause this. Creating LODs for every case where this happens would also be a significant amount of extra work (authoring and testing) that I’d like to avoid if possible.

I’ve added a bit more detail to the original post above. I’m interested in pragmatic, systematic solutions, but also interested in discussion about:

  1. what is the root cause,
  2. whether ThreeJS intends to handle this edge case,
  3. whether ThreeJS (or any realtime engine) can reasonably be expected to handle this

In my limited testing, these sparkles seem to happen no matter the light source type (point, directional, rectarea, envmap).

Here’s my handwavey working explanation. Would love to be corrected.

Even though the normal map is sampled with mipmapping enabled, each pixel/fragment only samples a single normal value (i.e. the return value of texture2D()). While mipmapping makes sure low-density sampling reflects the detail from the original “level 0” texture, individual pixels still sample just a single vec3 from the normalMap texture. This sampled value doesn’t include any information about the variation among the four texels of the higher detail level that it was averaged from.

So, when each pixel calculates its lighting, it assumes that the entire pixel has that normalMap value, effectively modelling each pixel as though it were a pixel-sized facet. Kinda like a disco ball. So when there are very sharp changes in the normal, you get little disco sparkles.
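
A toy example of the information that gets lost (plain numbers, nothing ThreeJS-specific). Averaging two opposing ~45° “microbevel” normals produces a single flat-ish normal, and after normalization the disagreement between them is completely invisible to the lighting:

```js
const a = [  0.70, 0.0, 0.71 ]; // tangent-space normal tilted ~45° one way
const b = [ -0.70, 0.0, 0.71 ]; // tilted ~45° the other way
const avg = a.map( ( v, i ) => ( v + b[ i ] ) / 2 ); // box filter result: [ 0, 0, 0.71 ]
// Shading with normalize( avg ) treats the whole pixel footprint as one
// flat, pixel-sized facet; the ±45° variation is gone.
```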

If this is correct, I was wondering whether ThreeJS might be able to somehow detect the “spread” of a mipmap-sampled normalMap value, and factor that in when deciding how bright the incoming light oughta be, effectively widening and flattening the BRDF by the appropriate amount.

There is an Nvidia paper (Toksvig, “Mipmapping Normal Maps”, 2005) about how the spread of the “upstream” normal texels can be inferred from how much the box filter has shortened the normal. I don’t know if that could be brought to bear here, but I thought it was a cool idea.

I have (just now) confirmed that the length of the vec3 value of texels in normal mipmaps drops significantly below 1.0 when the “upstream” texels have a lot of spread.
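
Continuing the toy example above, here’s a sketch of the paper’s inference (the function name is mine; see Toksvig’s paper for the derivation):

```js
// Infer the angular spread of the “upstream” texels from how much the
// box filter shortened the averaged normal (length 1.0 means no spread).
function toksvigVariance( filteredNormal ) {
  const len = Math.min( Math.max( Math.hypot( ...filteredNormal ), 1e-4 ), 1.0 );
  return ( 1.0 - len ) / len;
}
toksvigVariance( [ 0, 0, 0.71 ] ); // ≈ 0.41 (lots of spread)
toksvigVariance( [ 0, 0, 1.00 ] ); // 0 (faithful texel, no adjustment needed)
```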

Separately, I also noticed that in lights_physical_fragment.glsl.js there is a geometryRoughness calculation, which seems to be “roughing up” the material when the geometry normals are “bendy”, which makes a lot of sense. Can anyone confirm whether there is an equivalent mechanism for textures?
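
For reference, the block I’m talking about looks roughly like this (paraphrased from the r169 chunk; variable names have changed between revisions, so check the actual file):

```glsl
vec3 dxy = max( abs( dFdx( nonPerturbedNormal ) ), abs( dFdy( nonPerturbedNormal ) ) );
float geometryRoughness = max( max( dxy.x, dxy.y ), dxy.z );

material.roughness = max( roughnessFactor, 0.0525 );
material.roughness += geometryRoughness;
material.roughness = min( material.roughness, 1.0 );
```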

I did a proof of concept for this, and it does seem promising. Please let me know if this has already been invented and/or implemented.

By using the “squashed” normal effect (above) as a mask to boost the roughness before lighting calculations (similar to how geometryRoughness works), the specular aliasing is significantly reduced, at least in this case. Because it only tweaks roughness, this approach just piggybacks on the hard work others have already contributed to Three’s energy-conserving shading.

Here’s what I found…

Top row: By software-downsampling a difficult high resolution render, we have something like an “ideal” low resolution image to compare against. On the right, the “texture roughness mask” is shown in red, indicating where the length of normal texels has been “squashed” by dissimilar normals being averaged together by the mipmap box filter. This red mask is showing something like 1.0 - length(normalMap).

Bottom row: The leftmost is the current render in r169. Note the undesirable aliasing/sparkling. The next three pictures show the material.roughness value being increased by textureRoughness multiplied by an arbitrary factor. To my eye, 4 looks best here, but maybe 6 is more accurate if we trust our “ideal” goal.

Importantly, pixels that don’t “need help” are not affected. Full detail texels (texture level 0), and mipmapped texels that faithfully represent their upstream (high res) texels, have a length of 1.0, which means we add no extra texture roughness, and rendering proceeds as normal.
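
In case it helps the discussion, here’s roughly how my hack is wired up via onBeforeCompile. The chunk names and the mapN variable match r169 (verify against your revision), textureRoughness and the 4.0 factor are my own, and this assumes normalScale is 1 so the sample’s length isn’t rescaled. An integrated version would reuse the existing sample directly rather than relying on mapN leaking out of the include:

```js
material.onBeforeCompile = ( shader ) => {
  shader.fragmentShader = shader.fragmentShader
    // normal_fragment_maps declares mapN, the raw [-1,1] normal sample,
    // inline in main(), so it is still in scope right after the include.
    .replace(
      '#include <normal_fragment_maps>',
      `#include <normal_fragment_maps>
      float textureRoughness = 0.0;
      #ifdef USE_NORMALMAP_TANGENTSPACE
        // 1.0 - length( mapN ) is the red mask from the screenshots.
        textureRoughness = 1.0 - min( length( mapN ), 1.0 );
      #endif`
    )
    // Boost roughness before lighting, mirroring how geometryRoughness works.
    .replace(
      '#include <lights_physical_fragment>',
      `#include <lights_physical_fragment>
      material.roughness = min( material.roughness + textureRoughness * 4.0, 1.0 );`
    );
};
```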

Thoughts?


Hate to bump this; I hope it isn’t too rude. Just wondering if anyone, especially PBR shader contributors, can weigh in on this. Is this problem expected? Is this an approach that ThreeJS contributors would be open to integrating? Is a PR welcome?

I’d start with a CodePen that shows the issue and your solution. This gets everyone on the same page and able to discuss the issue.


PRs are welcome, but they won’t get traction unless people can see it with their own eyes / in the code.
Your writeup is good… but as you say, the handwavey aspects are things we could clarify/verify by looking at the shader source, and verifying them would help with getting traction:
https://ycw.github.io/three-shaderlib-skim/dist/#/latest/standard/fragment

Another option is to implement your proposal as an external class, and use that to demonstrate the need for it to be integrated into the core library… (although, spoiler alert, it’s somewhat unlikely that a solution involving precomputing masks and adding more complex branching logic to the shader will get pulled in.)


Thanks for the advice and encouragement. Showing working code sounds like a good idea, even if it’s just a rough hack demonstrating the mechanism.

To clarify: no precomputed masks, extra texture lookups, or branching are required, just a few extra instructions to adjust roughness before lighting is calculated, and only when a normal map is present.

We’d measure, of course, but my hunch is that the shader cost would be negligible. The adjustment could also easily sit behind an opt-in ifdef, disabled by default.


I think there used to be a way to generate a normal map and roughness map pair with mipmaps from a root normal map, but I can’t seem to find that example now. @donmccurdy may recall?

This is another issue from this forum on the same / similar topic (specular aliasing at subpixel geometry detail):


That was RoughnessMipmapper, but it was removed in three.js r137. I’ve kept around some notes and links to the original code here, with the idea of someday adding the option to glTF Transform:


Thanks! I’ve made an issue here to track some solutions in case you have any more context to add.


Seems promising, I’ll take a look! Out of interest, was this removed because of compatibility burden or some other reason?

I’m not sure, but I think it’s mostly what’s described in this issue: maintenance burden, plus limitations affecting production use (not compatible with compressed texture formats).


From that issue:

> tell anyone who wants it to use KTX2 instead and hopefully a tool exists that will generate mipmaps properly rather than blindly

Does gltf-transform have the ability to generate normal / roughness mipmaps and embed them in KTX2 textures? On-the-fly generation is great, but it makes the most sense to embed these things in the final asset instead.

I haven’t implemented this in glTF Transform so far; no particular reason, just limited available time. If there are examples of geometry and textures that would benefit (glTF files or otherwise), that would be a helpful addition to the feature request.

For an offline implementation I probably wouldn’t generate the maps on the GPU — high cost to maintain in Node.js — but would instead do something similar to how I convert glossiness textures to roughness textures here:
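
Roughly the shape that could take (names are hypothetical, not existing glTF Transform API; assumes float arrays, with normals in [-1, 1] produced by box filtering the mip level above):

```js
// Bump a roughness mip by the Toksvig-style shortening of the paired
// normal mip: texels whose source texels disagreed have length < 1.
function bakeRoughnessMip( normals /* Float32Array, xyz per texel */, roughness /* Float32Array */ ) {
  for ( let i = 0; i < roughness.length; i++ ) {
    const x = normals[ 3 * i ], y = normals[ 3 * i + 1 ], z = normals[ 3 * i + 2 ];
    const len = Math.min( Math.hypot( x, y, z ), 1.0 );
    roughness[ i ] = Math.min( roughness[ i ] + ( 1.0 - len ), 1.0 );
  }
}
```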
