Why doesn't my canvas texture look crisp and accurate when I zoom in?

Hi all, I'm using an 8192x4096px image as a texture on a GLB sphere. When I zoom the camera in, I don't see the detail I'd expect from zooming into a high-res image. Even at a normal viewing distance the texture appears less crisp and slightly blurry compared to the original image. It seems like three.js or R3F is sampling the pixel values improperly, especially around Egypt, the US, and other regions. I tried an even higher-resolution image but it produced the same result.

CodeSandbox demo: ol-R3F - CodeSandbox
This is the actual image I'm using as the texture: https://static.experientia.in/nasaBlackMarble/2016_nasaBlackMarble_8K_bin.webp

Please compare the two at Egypt (i.e. the demo vs. the texture image).

I've been trying to debug this issue for the last month. I thought it might be OpenLayers's fault, since I'm projecting the map onto a sphere via a canvasTexture, but the answer is no: I can see OpenLayers's canvas rendering a clear image. I just don't understand what happens when I use it in R3F.

Can you please help me with this? I've tried almost everything over the last two weeks but nothing seems to work.

If you want a pixel-perfect image, try this:

<canvasTexture
  ref={textureRef}
  attach="map"
  image={canvasTexture}
  flipY={false}
  minFilter={LinearFilter}
  magFilter={NearestFilter}
  mapping={EquirectangularReflectionMapping}
  // anisotropy={10}
/>

The factors at play here are the magnification filter and anisotropy:

https://threejs.org/docs/index.html?q=texture#api/en/textures/Texture.anisotropy

https://threejs.org/docs/index.html?q=texture#api/en/textures/Texture.magFilter

Also import the filters and mapping:

import { EquirectangularReflectionMapping, LinearFilter, NearestFilter } from "three";
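For reference, here is roughly what the same settings look like in plain three.js, outside of R3F. This is just a sketch, not your demo code; the renderer/texture setup is assumed, and the URL is the one from the original post:

import * as THREE from "three";

// Minimal sketch: load the equirectangular image directly and apply the
// sampling settings discussed above.
const renderer = new THREE.WebGLRenderer();
const texture = new THREE.TextureLoader().load(
  "https://static.experientia.in/nasaBlackMarble/2016_nasaBlackMarble_8K_bin.webp"
);

// Anisotropy improves sampling at glancing angles (e.g. toward the sphere's poles);
// getMaxAnisotropy() reports the hardware limit.
texture.anisotropy = renderer.capabilities.getMaxAnisotropy();

// magFilter is used when zoomed in (one texel covers many screen pixels):
// NearestFilter keeps hard pixel edges, LinearFilter blends neighbouring texels.
texture.magFilter = THREE.NearestFilter;
texture.minFilter = THREE.LinearFilter;
texture.mapping = THREE.EquirectangularReflectionMapping;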

@tfoller I tried that too but it's not accurate. Is there any other way to improve on it? You can see the difference between the actual image and the CodeSandbox demo linked in the original post.

Define "accurate". You wanted it "crisp" - this, I believe, is as crisp as it gets.

You are probably comparing a zoomed in version of the image in some software (browser, Photoshop?) to a zoomed in version of the texture in THREE and they somehow don’t match. There is no way for me to replicate that.

You need to provide both images as screenshots side by side, at the same zoom level, otherwise it’s hard to understand what you are trying to achieve.

@tfoller I can't attach the image here directly, as it would get compressed, but please check out this image, which I've used in the canvas as a canvasTexture.
https://static.experientia.in/nasaBlackMarble/egypt_middleEast_2016.webp

To really understand what I mean by crisp/accurate, please try zooming into the CodeSandbox demo and the image at Egypt. three.js doesn't seem to preserve the detail while zooming.

Do you see this error in the console as well?

THREE.WebGLRenderer: Texture has been resized from (27000x13140) to (16384x7973).

It looks like the demo is using dynamically generated data from OpenLayers rather than the 8K texture attached above. The dynamically generated texture does not have the power-of-two dimensions recommended for WebGL, and must be resized (down to a lower resolution) in order to generate mipmaps for use in 3D scenes.
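If you want to verify this on your end, a quick sketch along these lines (the canvas selector and renderer here are placeholders, not your actual code) will print the relevant limits:

import * as THREE from "three";

// Log the GPU texture-size limit and whether the source canvas is power-of-two.
const renderer = new THREE.WebGLRenderer();
const canvas = document.querySelector("canvas"); // placeholder for the OpenLayers canvas

const isPowerOfTwo = (n) => (n & (n - 1)) === 0;
console.log("max texture size:", renderer.capabilities.maxTextureSize); // e.g. 16384
console.log("canvas size:", canvas.width, canvas.height);
console.log("power of two:", isPowerOfTwo(canvas.width) && isPowerOfTwo(canvas.height));
// Anything larger than maxTextureSize gets resized down by three.js,
// which is exactly the warning quoted above.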

I would recommend simplifying the scope of the problem, using the static texture and not OpenLayers; none of us have experience with OpenLayers. It will be easier to compare results without additional dependencies.
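For example, a stripped-down reproduction along these lines (plain three.js, texture URL from the original post, everything else assumed) would be much easier to compare against the source image:

import * as THREE from "three";

// One sphere, one static texture, no OpenLayers.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

new THREE.TextureLoader().load(
  "https://static.experientia.in/nasaBlackMarble/2016_nasaBlackMarble_8K_bin.webp",
  (texture) => {
    const globe = new THREE.Mesh(
      new THREE.SphereGeometry(1, 64, 64),
      new THREE.MeshBasicMaterial({ map: texture })
    );
    scene.add(globe);
  }
);

renderer.setAnimationLoop(() => renderer.render(scene, camera));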

@donmccurdy After our conversation on Discord, once I started using the 8192x4096px image I stopped getting that error.

Here's how you can replicate it on your end. This isn't an OpenLayers issue but something in three.js itself. Just make a canvas 8192 x 4096 px, draw this image into the canvas as described in that article, and use the canvas element as a texture on the sphere. You'll get the same result this thread describes. I tried it yesterday.
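Roughly, the setup looks like this (a sketch of the approach just described, not the exact demo code):

import * as THREE from "three";

// Draw the image into an 8192x4096 canvas, then use that canvas as the texture.
const canvas = document.createElement("canvas");
canvas.width = 8192;
canvas.height = 4096;
const ctx = canvas.getContext("2d");

const texture = new THREE.CanvasTexture(canvas);

const image = new Image();
image.crossOrigin = "anonymous";
image.src = "https://static.experientia.in/nasaBlackMarble/2016_nasaBlackMarble_8K_bin.webp";
image.onload = () => {
  ctx.drawImage(image, 0, 0, canvas.width, canvas.height);
  texture.needsUpdate = true; // tell three.js the canvas contents changed
};

// `texture` is then assigned to the sphere material's `map`
// (or the canvas is passed as image={canvas} to <canvasTexture> in R3F).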

I'll update my CodeSandbox demo for it shortly as well.

@donmccurdy Here's the CodeSandbox demo. It's pretty much the same as what you saw before.

Please try reloading the window a few times if the display seems blank; CodeSandbox seems to take some time to fetch the image.

With regard to my previous posts, please try zooming in at Egypt. That's where the issue is most apparent.

@tfoller @donmccurdy Any suggestions for improving things even in my plain version? Or is there any way I can layer multiple textures on a sphere?

This plain version demo uses a texture that is 13500px by 6750px, not 8K.

Comparing the original against what I see in WebGL…

original: [screenshot]

webgl: [screenshot]

… I do not see a huge difference in terms of pixel resolution, other than some effect of the mipmaps. If you want to see pixel edges more sharply, try THREE.NearestFilter for the minFilter and magFilter settings.

Applications like Google Earth have very complex systems for gradually streaming in prepared data from many textures. Perhaps some examples exist doing simpler versions of the same thing in three.js, but in terms of the results you can get from a single texture, I don't see that there is a bug here. Colors are off, but none of the color management best practices are set up either, so I haven't debugged that.

Result when just loading the texture directly (no CanvasTexture) and applying it to a SphereGeometry in the three.js editor. I don't know if I'm seeing more detail, but the colors are improved with correct color management settings.
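For reference, the color management settings involved are roughly these (a sketch; the property names depend on your three.js version, since recent releases replaced encoding with colorSpace):

import * as THREE from "three";

const renderer = new THREE.WebGLRenderer();
const texture = new THREE.TextureLoader().load(
  "https://static.experientia.in/nasaBlackMarble/2016_nasaBlackMarble_8K_bin.webp"
);

// Color textures should be tagged as sRGB, and the renderer should output sRGB.
texture.colorSpace = THREE.SRGBColorSpace;
renderer.outputColorSpace = THREE.SRGBColorSpace;

// On older releases the equivalent was texture.encoding = THREE.sRGBEncoding
// and renderer.outputEncoding = THREE.sRGBEncoding.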

@donmccurdy I actually forgot to upload the 8K image I was using locally; it's been uploaded now.

One thing I wanted to know: is there a way to stack multiple images on top of one another, with transparency support, as textures on a three.js material? That way I could choose which texture to show, or highlight a country boundary, based on scroll position or keystrokes.

If I have to do this in Blender I can use concentric UV sphere meshes, but that would push the GLB size to 5-15 MB.

Pretty much anything is possible: you can write shaders to fade between textures, use clever render-order tricks to create overlays, swap models out entirely, and so on. On the more feature-rich side you have applications like Cesium.js, which do streaming renders at arbitrary levels of detail anywhere on the globe. Things can get complex pretty fast, depending on how much you're trying to do. You can find a lot by searching "webgl globe" and related terms; maybe this is the kind of thing you're looking for? OpenLayers and three.js | George Panagiotopoulos | JavaScript in Plain English
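As one concrete illustration of the overlay idea, here is a sketch; the texture URLs are hypothetical placeholders (e.g. country borders drawn on a transparent background):

import * as THREE from "three";

const scene = new THREE.Scene();
const loader = new THREE.TextureLoader();
const baseTexture = loader.load("earth.webp");         // placeholder URLs
const overlayTexture = loader.load("boundaries.webp"); // image with an alpha channel

// Base globe with the opaque Earth texture.
const globe = new THREE.Mesh(
  new THREE.SphereGeometry(1, 64, 64),
  new THREE.MeshBasicMaterial({ map: baseTexture })
);

// A slightly larger sphere carrying the transparent overlay.
const overlay = new THREE.Mesh(
  new THREE.SphereGeometry(1.001, 64, 64),
  new THREE.MeshBasicMaterial({
    map: overlayTexture,
    transparent: true,
    depthWrite: false, // avoid z-fighting with the base sphere
  })
);
overlay.renderOrder = 1; // draw after the base globe

scene.add(globe, overlay);

// Fading the overlay in/out (e.g. from scroll position) is then just a matter of
// animating overlay.material.opacity between 0 and 1.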

Ah, shaders. I've heard a lot of good things about them, though they need extensive practice. I did look into Cesium, but its docs aren't as easy to get started with as three.js's, and there's no tutorial for it. There's also Mapbox's globe, which they've designed to compete with Google Earth, but the problem with all of them is that I don't know whether they support a Blender-style camera for my project (it's still in progress), because I want to animate it like a satellite and an FPV drone. So far, looking at their docs, I guess they don't.

I actually sent you that article during our Discord discussion; that's the exact method I'm using right now. But sure, I'll look forward to implementing that sort of thing in the future, since right now I'm doing all of this by myself.

The WebGL globe demos were interesting; I'll look into their GitHub code.

Ah sorry, too many threads here :sweat_smile:
