Hi,
This PNG image looks too gray when rendered at a small size.
When the camera is close, it looks like this, which is nice:
But when the camera moves farther, I get this:
What I want:
I guess it’s the mipmaps’ fault. Maybe the downsampling isn’t done well, so it mixes the (0, 0, 0) RGB color of fully transparent pixels with the opaque pixels (averaging one opaque white pixel with three transparent black neighbors gives a dark gray of about (64, 64, 64) instead of white).
Setting up mipmaps with a specific sampling method manually is hard for me, and it may be inefficient, since OpenGL generates mipmaps directly on the video card.
Any suggestion would be great. Thanks very much.
You can set the minFilter of the texture; see https://threejs.org/docs/index.html#api/en/textures/Texture.minFilter
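For example, a minimal sketch (assuming `texture` is your loaded PNG texture): a non-mipmapped filter such as `THREE.LinearFilter` makes the sampler read only the base level, at the cost of some aliasing when the texture is small on screen:

```js
// Disable mipmap-based minification on an already-loaded texture.
// LinearFilter samples only the base image, so dark texels from
// lower mip levels can no longer bleed into the result.
texture.minFilter = THREE.LinearFilter;
texture.needsUpdate = true; // re-upload with the new filter setting
```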
I think the following post at stackoverflow answers this question:
It’s in the context of OpenGL, but I think it also applies to WebGL; WebGL just uses the underlying native 3D API, e.g. OpenGL, DirectX, etc.
What kind of minFilter could solve this theoretically? I tried every kind of filter and none of them gave me a better result. `(>﹏<)′
Can’t you just modulate the color of the material in order to fine-tune the result?
Besides, this could also be a color-space issue. If your renderer has a configured output encoding, you also have to set Texture.encoding to a proper value, most probably THREE.sRGBEncoding.
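In code, that could look like the following sketch (assuming `renderer` and `texture` are your WebGLRenderer and the PNG texture; this uses the `encoding` API from the docs linked above, which applies to three.js versions before r152):

```js
// Keep the texture's encoding consistent with the renderer's output.
renderer.outputEncoding = THREE.sRGBEncoding; // output is sRGB
texture.encoding = THREE.sRGBEncoding;        // PNG pixel data is sRGB
texture.needsUpdate = true;                   // re-upload with new encoding
```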
The texture is given to us by clients. They found that the render result differs from their modeling software; the “What I want” images come from that software. We use the same shader code, so the render results look the same most of the time.
I use my own shader material and do the gamma correction myself. I added an image rendered with a closer camera above, and its color is fine, so I don’t think this is related to color space.
Finally, I decided to compute the mipmaps for the PNG image myself, because the alpha channel must be taken into account when downscaling: https://entropymine.com/imageworsener/resizealpha/
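For anyone who finds this later, here is a minimal sketch of that idea (not the actual production code): each mip level is built with alpha-weighted averaging, so fully transparent texels contribute no color, and the levels are handed to three.js through `texture.mipmaps` with automatic generation turned off. It assumes `texture` is a `THREE.DataTexture` holding RGBA8 data; the helper name `downscaleWithAlpha` is made up for this example.

```js
// Build one mip level with alpha-weighted (premultiplied) averaging.
// src is RGBA8 pixel data of size w x h; returns the half-size level.
function downscaleWithAlpha(src, w, h) {
  const dw = Math.max(1, w >> 1), dh = Math.max(1, h >> 1);
  const dst = new Uint8ClampedArray(dw * dh * 4);
  for (let y = 0; y < dh; y++) {
    for (let x = 0; x < dw; x++) {
      let r = 0, g = 0, b = 0, a = 0;
      for (let dy = 0; dy < 2; dy++) {
        for (let dx = 0; dx < 2; dx++) {
          // Clamp so non-square / odd sizes don't read out of bounds.
          const sx = Math.min(2 * x + dx, w - 1);
          const sy = Math.min(2 * y + dy, h - 1);
          const i = (sy * w + sx) * 4;
          const alpha = src[i + 3];
          r += src[i] * alpha;     // weight color by alpha, so fully
          g += src[i + 1] * alpha; // transparent black texels add nothing
          b += src[i + 2] * alpha;
          a += alpha;
        }
      }
      const o = (y * dw + x) * 4;
      dst[o] = a ? r / a : 0;     // divide color sums by the alpha sum
      dst[o + 1] = a ? g / a : 0;
      dst[o + 2] = a ? b / a : 0;
      dst[o + 3] = a / 4;         // alpha itself is a plain average
    }
  }
  return { data: dst, width: dw, height: dh };
}

// Attach the hand-built chain; level 0 must be the base image.
let { data, width, height } = texture.image;
texture.mipmaps = [{ data, width, height }];
while (width > 1 || height > 1) {
  ({ data, width, height } = downscaleWithAlpha(data, width, height));
  texture.mipmaps.push({ data, width, height });
}
texture.generateMipmaps = false; // use our levels, not the GPU's
texture.needsUpdate = true;
```

Note that this averages in whatever color space the pixel data is stored in; for strict correctness you would convert to linear first, but for the gray-fringe problem the alpha weighting is what matters.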