Mipmap generation failing on certain texture sizes causing minification aliasing

Thanks, I had a look at the docs and tried this before posting the question. I created a set of custom mipmaps by using a canvas element to resize the original texture; the result looked the same as what Three.js does under the hood, since there was no visible change.
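For reference, the 2×2 box downsampling that drivers typically apply per mip level can be sketched in plain JS. This is a hypothetical grayscale-only helper, not actual Three.js internals, but it shows why a canvas resize gives you essentially the same chain:

```javascript
// Downsample one mip level with a 2x2 box filter (the averaging a GPU
// driver typically uses when generating mipmaps). Grayscale, flat array.
function downsample2x(pixels, size) {
  const half = size / 2;
  const out = new Float32Array(half * half);
  for (let y = 0; y < half; y++) {
    for (let x = 0; x < half; x++) {
      const i = 2 * y * size + 2 * x;
      // Average the 2x2 block of source texels.
      out[y * half + x] =
        (pixels[i] + pixels[i + 1] + pixels[i + size] + pixels[i + size + 1]) / 4;
    }
  }
  return out;
}

// Build the full chain down to 1x1 (assumes a square power-of-two size).
function buildMipChain(pixels, size) {
  const levels = [pixels];
  while (size > 1) {
    pixels = downsample2x(pixels, size);
    size /= 2;
    levels.push(pixels);
  }
  return levels;
}
```

In Three.js you could hand custom levels like these to `texture.mipmaps` (as canvases or data textures), but as noted, if they're built with the same plain box filter the result won't look any different.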

I then tried blurring the mipmaps progressively but so much quality was lost it just became a block of colour and ruined the appearance.

I’m wondering if there is any other process I can do when creating these mipmaps to avoid the emergent pattern? Aside from blurring / adding noise I can’t think of anything.

If you zoom out on the cube, it does eventually go away and becomes a solid color. It seems like… perhaps generating the mipmaps as a blend between the original texture, and a solid color… or… creating a pre-scaled version of the texture in photoshop with blur, and generating mips from a crossfade of the original and prescaled might do something good.

I’m falling asleep tho, but the glitch is there, so maybe someone else will take a look and have some ideas. :slight_smile:
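The blend-toward-a-solid-color idea could be sketched like this (hypothetical helper names, grayscale values, untested against the actual texture): the deeper the mip level, the more each texel gets pulled toward the image's average color, suppressing whatever pattern emerges.

```javascript
// Average value of a grayscale image stored as a flat array.
function averageColor(pixels) {
  let sum = 0;
  for (const p of pixels) sum += p;
  return sum / pixels.length;
}

// Blend a mip level toward the average color: level 0 is untouched,
// the deepest level becomes a solid color.
function blendTowardAverage(pixels, level, maxLevel) {
  const avg = averageColor(pixels);
  const t = level / maxLevel; // 0 = original, 1 = solid color
  return pixels.map(p => p * (1 - t) + avg * t);
}
```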

1 Like

I notice the pattern also seems to jump out subtly when I zoom to 25% in Photoshop (just using the magnify tool). But when I downsize the image by 50% twice, the pattern shows up very clearly. (I have increased the contrast for clarity.)

Zoom 25%

Resized 50% + 50% (i.e. 25%)

Downsizing only once, directly to 25% gives almost identical results. I also tried all the resampling algorithms. Same issue. Really dang curious.

2 Likes

yeah. it is an interesting artifact. I think tho that it would even occur in real life if that pattern was like… 3d printed at the same resolution.

1 Like

Applying a 1px box blur before doing the resize seems to eliminate the issue, at least in photoshop.

Original left, box-filtered right:

Here’s the resized result with the filter applied (to the original full size texture).

Looking closely at the unfiltered pixels, there is a prominent horizontal frequency that’s really close to a 4px wavelength. Here’s a section of a single row of pixels, stretched vertically for visibility and offset by 4px. You can see there’s quite a lot of “resonance” at that offset, a lot of features overlap.
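For anyone wanting to poke at this outside Photoshop, a 1px-radius box blur on a single row of pixels is just a 3-tap average (a sketch, not the exact Photoshop filter; edges are clamped by averaging only the taps that exist):

```javascript
// 1px-radius box blur over a row of grayscale pixels.
// Each output pixel is the average of itself and its neighbors.
function boxBlurRow(row, radius = 1) {
  const out = new Float32Array(row.length);
  for (let i = 0; i < row.length; i++) {
    let sum = 0;
    let count = 0;
    for (let k = -radius; k <= radius; k++) {
      const j = i + k;
      if (j >= 0 && j < row.length) {
        sum += row[j];
        count++;
      }
    }
    out[i] = sum / count;
  }
  return out;
}
```

A 3-tap average like this strongly attenuates content near a 2–4px wavelength, which is consistent with it killing that ~4px horizontal frequency before the resize.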

2 Likes

Yup, it’s a textbook worst case for texture filtering.

A related technique, sometimes referred to as “macro/micro texture blending”, is where you blend in the same texture at different magnifications at larger distances to hide the repetition.
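In practice that blend lives in a fragment shader, but the math is just a distance-driven mix between two samples of the same texture at different scales. A JS sketch with made-up names:

```javascript
// Hermite smoothstep, same as the GLSL built-in.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Fade from the fine-scale (micro) sample up close to the
// coarse-scale (macro) sample far away, hiding tiling repetition.
function blendMacroMicro(microSample, macroSample, distance, near, far) {
  const t = smoothstep(near, far, distance); // 0 up close, 1 far away
  return microSample * (1 - t) + macroSample * t;
}
```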

1 Like

Honestly, I don’t find any of this a complete or satisfying explanation, because I didn’t realize that texture filtering was fallible (susceptible to aliasing) like this. I’m actually a little bit troubled by this “pathological” texture, it’s kinda blowing my mind.

But “try blurring the texture a little bit” might be a practical workaround for @ziggy-orbital. This fix reminds me of how digital audio plugins often apply a low-pass filter before downsampling, to avoid introducing aliasing. If you don’t do this, frequencies above half the final sampling rate (the Nyquist limit) are reflected downward, which sounds terrible.

I’m just surprised that the mipmap downsampling isn’t already taking care of this by virtue of its 2x2 box filter, which is basically a lowpass filter. :thinking:
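A tiny 1D example of that audio analogy (using the simplest possible signal, an alternation right at the Nyquist limit): naive decimation collapses the oscillation into something flat and wrong, while box averaging at least preserves the local mean.

```javascript
// The highest frequency a sample rate can represent: 0,1,0,1,...
const signal = [0, 1, 0, 1, 0, 1, 0, 1];

// Downsample 2x by taking every 2nd sample: the oscillation
// aliases to a constant 0, losing the signal's average entirely.
const decimated = signal.filter((_, i) => i % 2 === 0);

// Downsample 2x by averaging adjacent pairs (the 1D analogue of
// the 2x2 box filter): the correct mean of 0.5 survives.
const averaged = [];
for (let i = 0; i < signal.length; i += 2) {
  averaged.push((signal[i] + signal[i + 1]) / 2);
}
```

The 2×2 box filter does handle this worst case, which is why the couch artifact is so puzzling; the resonance near a 4px wavelength noted above suggests the trouble shows up a level or two deeper in the chain, where the box filter's crude frequency response lets some of that energy fold through.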

1 Like

I understand your dissatisfaction, but I would counter that there are plenty of examples in real-world materials where effects like this happen. We don’t normally see it on real couch cushions because they aren’t textured with perfectly repeating sections of identical noisy cloth.

I would also be more disturbed if the pattern wasn’t relatively stable… like the true moire you get from unfiltered textures.

The only “real” solution would be to texture that couch with a non-tiled, photogrammetric scan of the full couch surface.

It’s easy to forget that everything we are doing in 3D is “fake”.
It’s just that this particular sacrifice (that was made for performance i.e. tiling textures) breaks the illusion in this case.

2 Likes

Why do you think photoshop’s 25% preview looks “pretty good”, but its 25% image resize operation looks terrible, with all the algorithms (nearest, linear, bicubic)? There’s really quite a big difference in the outcome.

This is clearly not a ThreeJS-specific issue, but it still feels like an aliasing phenomenon we should be able to “name and explain”. I agree it’s not quite moire, since it’s stable…

2 Likes

I asked chatgpt about it:

“a friend is using threejs and is seeing this weird effect on a fabric texture when zoomed out, a subtly different pattern emerges. its not a moire like you get from bad mipmapping… but its weird. is there a name for this?”

ChatGPT said:

That sounds like anisoaliasing or aliasing from anisotropic filtering limitations. It happens when high-frequency details in a texture (like a fine fabric weave) interact with the screen resolution and perspective distortion, causing unintended patterns to emerge at a distance.

It’s different from classic Moiré (which is usually from bad mipmapping or insufficient texture resolution) and instead happens due to how anisotropic filtering samples textures at oblique angles. If the texture has fine, regular details, lower mip levels might not capture them correctly, leading to subtle emergent artifacts.

Possible fixes:

1. Increase Anisotropy:

```js
texture.anisotropy = renderer.capabilities.getMaxAnisotropy();
```

This helps preserve more texture detail at oblique angles.

2. Reduce High-Frequency Detail:
If the texture has a tight weave pattern, a lower-frequency version might reduce artifacts.

3. Check UV Distortion:
If the fabric is on a dynamic surface (like a cloth sim), small distortions in the UV mapping can create unexpected patterns.

If none of these solve it, a screenshot might help diagnose it better.

1 Like

Thanks for all of your help. Is Three.js not already blurring the image when it creates the mipmaps? Did you find that the 1px box blur completely resolved the issue?

1 Like

Changes to anisotropy have not yielded any results for me, unfortunately.

1 Like

The problem here is we run into serious GPU limitations on mobile. I am going to try to use the 1.4m scans we have of these types of fabric, although the pattern seems innate to the material.

1 Like

Replying to myself, the mipmap generation is handled by the GPU driver.

2 Likes

The emergent pattern resembles a woven pattern. Can your ideal process be procedurally-generated? A more resilient lossless edit of 10,000 64px nanofibers?

< :cat_face: />

Changing the texture to a square resolved the issue. Problem solved :slight_smile:

1 Like

Oh crap. Good catch! :smiley: well that was fun.

1 Like

I think actually I misread the documentation. I thought power of two meant multiple of two, but it obviously means power of two, easy mistake to make.

1 Like

Still confused, as the square image I’m using allows the mipmaps to be created even though it’s not a power of two. It is 2000 × 2000 pixels.

The documentation is a bit confusing on this. I’m still not sure whether any square is fine, or whether it has to be a square OR a power of two.
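For what it’s worth, checking power-of-two-ness is a one-liner: a power of two has exactly one bit set, so `n & (n - 1)` clears it to zero. 2000 is a multiple of two but not a power of two; 2048 would be.

```javascript
// True for 1, 2, 4, 8, ..., 1024, 2048, ...
// WebGL1 required power-of-two dimensions for mipmapping;
// WebGL2 relaxed this restriction.
function isPowerOfTwo(n) {
  return n > 0 && (n & (n - 1)) === 0;
}
```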

1 Like

I’m removing my solution, as in WebGL2 non-power-of-two textures are meant to be mip-mappable. I still can’t figure out why certain sizes are mipmapping and others are not.

1 Like