Texture a sphere from a mercator projected image

Hi,

I have a map in mercator projection and would like to texture a sphere with it.

What I have tried is this: for each pixel (x1, y1) of a new canvas representing the equirectangular projection of the map, I compute the corresponding lat/long coordinates, and from those I compute the corresponding (x2, y2) coordinates in the Mercator projection.
I can then copy the pixel at (x2, y2) in the Mercator-projected image into the pixel at (x1, y1) of the equirectangular canvas, which I finally use as a texture for the sphere.
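A minimal sketch of that per-pixel resampling, for reference (all names, the orientation, and the assumed Mercator y-range are illustrative; real code would read the pixels from a canvas `ImageData`):

```javascript
// Resample a Mercator-projected RGBA image into an equirectangular canvas.
// Assumption: the source map spans Mercator y in [-yMax, yMax], top row = north.
function reprojectToEquirectangular(srcData, srcW, srcH, dstW, dstH, yMax = 2.542) {
  const dst = new Uint8ClampedArray(dstW * dstH * 4);
  for (let y1 = 0; y1 < dstH; y1++) {
    const lat = (0.5 - y1 / dstH) * Math.PI;                  // +PI/2 … -PI/2
    const mercY = Math.log(Math.tan(Math.PI / 4 + lat / 2));  // Mercator y
    const y2 = Math.round((0.5 - mercY / (2 * yMax)) * srcH); // source row
    if (y2 < 0 || y2 >= srcH) continue;                       // outside map coverage
    for (let x1 = 0; x1 < dstW; x1++) {
      const x2 = Math.floor((x1 / dstW) * srcW);              // longitude maps linearly
      const s = (y2 * srcW + x2) * 4, d = (y1 * dstW + x1) * 4;
      dst[d] = srcData[s]; dst[d + 1] = srcData[s + 1];
      dst[d + 2] = srcData[s + 2]; dst[d + 3] = srcData[s + 3];
    }
  }
  return dst;
}
```

The nested per-pixel loop on the CPU is exactly why this approach is slow for large textures.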

With a little tweaking it works okay, but it really is not perfect and the performance is bad.

So, do you have any idea how I could achieve what I want in a faster way?

Thanks in advance,


Yes. :smirk:

The better and faster approach, which leverages the power of the GPU, is to use UV mapping.

A sphere in Three.js is approximated by a regular mesh of triangular faces and vertices (the corner points of the triangles).

Three.js provides data structures to attach uv-coordinates to vertices, like this:
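A minimal sketch of what that looks like (the vertex data and `makeUVs` are illustrative; the actual three.js call is shown in a comment so the snippet stays self-contained):

```javascript
// One (u, v) pair per vertex, flattened into a typed array —
// this is the layout three.js expects for the 'uv' attribute.
function makeUVs(pairs) {
  const uvs = new Float32Array(pairs.length * 2);
  pairs.forEach(([u, v], i) => {
    uvs[2 * i] = u;
    uvs[2 * i + 1] = v;
  });
  return uvs;
}

// With three.js, assuming `geometry` is a THREE.BufferGeometry:
//   geometry.setAttribute('uv', new THREE.BufferAttribute(makeUVs(pairs), 2));
```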

You need to understand what uv-coordinates do, and then assign proper uv-coordinate values to each vertex. The interpolation across all affected pixels of each triangle is performed by the GPU.

In order to find proper u-v-coordinate values it helps to understand the nature of a Mercator projection, which projects the features of a (semi-transparent) globe from a point-light source at the center of the globe onto a cylinder surface wrapped around the globe. See this illustration:
https://upload.wikimedia.org/wikipedia/commons/b/b5/Comparison_of_Mercator_projections.svg

Note that this involves Math.tan() behaviour, which is why you never see the poles covered by a true (standard) Mercator projection.

If you can’t put all the pieces together, do come back and ask more specific questions.


Thank you for your explanations, they helped me a lot!

I am going to try and implement a solution and will come back to you if I have any questions.

I was wondering: does the fact that it simply interpolates between the vertices for each pixel usually create any artifacts or distortions?

You came to the right place :wink:

Incidentally I have investigated such systemic distortions, which (as of my current knowledge) seem unavoidable, although I haven’t completely given up on the topic yet.

Very interesting work you have done there! I think I will be okay with those distortions for the moment, then.

I still have some trouble with the mapping though… From the xyz positions of the vertices I can easily compute the corresponding lat/long coordinates, but computing the uv coordinates with the Mercator projection is then a bit tricky. I followed the equations from the Wikipedia page, and I think my trouble comes from the tan behaviour you were talking about earlier, but I can't figure out how to fix it.

Basically, when the latitude is above PI/2 * 0.9 (respectively below -PI/2 * 0.9), I just set v to 1 (respectively 0). When it is between those two values, I simply apply the equation for y from the Wikipedia page and map it from the range [-2.542, 2.542] (the y values corresponding to latitudes of ±PI/2 * 0.9) to the range [0, 1].

But the result is only partially correct and I don't know how to fix it.
Here is my implementation of the code.
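A reconstruction of the clamping approach described above (not the original code; it assumes the standard Mercator formula y = ln(tan(PI/4 + lat/2)) and the ±0.9 · PI/2 cutoff):

```javascript
// Latitude cutoff and the matching Mercator y value (≈ 2.542 at ±81°).
const LAT_MAX = 0.9 * Math.PI / 2;
const Y_MAX = Math.log(Math.tan(Math.PI / 4 + LAT_MAX / 2));

// Map latitude (radians) to a v-coordinate in [0, 1], clamping at the cutoff.
function mercatorV(lat) {
  if (lat >= LAT_MAX) return 1;
  if (lat <= -LAT_MAX) return 0;
  const y = Math.log(Math.tan(Math.PI / 4 + lat / 2)); // Mercator y
  return (y + Y_MAX) / (2 * Y_MAX);                    // [-Y_MAX, Y_MAX] → [0, 1]
}
```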

I would like to take a step back, and have a look at the problem from a little distance.

In a Mercator projection, you have an imaginary point light source at the center of the globe.
You wrap something like a flexible cylindrical screen around the globe; let's assume it touches the equator. This results in something like the following:

You see a cross-section of the globe, north and south pole shown, equator displayed as a fat blue line.
I’ve only shown one half of the problem, because it’s perfectly symmetrical WRT the globe’s N-S axis.

Note how a perfectly regular angular subdivision of the latitudes (here: once every 15°) translates to a hugely distorted image on the projection screen, the vertical magenta line. (This is the “tangent behaviour” I was speaking of.) That’s the hallmark of the Mercator projection, btw… which the British liked so much because it subtly supported their claim of supremacy over the Indian sub-continent: it made their tiny island appear much more significant in comparison to the huge Indian landmass. I’m digressing …

It’s also important to note that a Mercator-projected map can inherently cover only a limited latitude range, because going all the way from the north pole to the south pole would require a map of infinite height (tan(90°)), which is mathematically not possible. The example I’ve shown covers the range from 75° southern latitude up to 75° northern latitude.

Whatever the range: each map covers mapping coordinates ranging from (0, 0) at its lower-left corner to (1, 1) at its upper-right corner, so they are normalized regardless of whatever physical dimensions [pix] the bitmap may have.

The process of mapping is the inverse of projection.

WRT the top illustration, you’ll have to invert the direction of the arrows, which will inevitably involve a Math.atan() behaviour.

What you’ll need to know:

What is the latitude range covered by your Mercator-projected map? The map can’t possibly provide information beyond its range of coverage.

Points on the lowest covered latitude will have a “v”-component of u-v-mapping coordinates of 0.0.
Points on the highest covered latitude will have a “v”-component of u-v-mapping coordinates of 1.0.

Points at intermediate latitudes will have intermediate mapping coordinates, derived from the Math.atan() function.

For the “u”-value of the mapping coordinates, keep in mind that the flat 2D Mercator-projected map has been unrolled from a seamless cylinder enclosing the globe and touching it at the equator. So the width of the unrolled map covers the range of 2 * Math.PI.

The “u”-values of the u-v-mapping coordinates correspond to their fraction of a full circle. In your imagination, don’t go 180° east or west, but rather from 0° to 360° in a suitable direction.

Example:
Points on the 90° (East) longitude would all have a “u”-value of 0.25 (one fourth of the full circumference).
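As a tiny sketch, the “u” computation is just that fraction of a full circle (assuming longitude in radians, measured 0 … 2π eastwards from the map’s left edge):

```javascript
// u-coordinate as the longitude's fraction of the full 2π circumference.
function mercatorU(lonRad) {
  return lonRad / (2 * Math.PI);
}
// 90° east of the map's left edge → u = 0.25
```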


I have to correct myself on my above quote - it’s plain wrong.

The way to find the “v” component of mapping coordinates in the above example is like shown in the following sketch:

I’ve replaced the “tan(value)” terms from my previous sketch with their actual results. For the example map covering the -75° … +75° latitude range, the result range has a height of

2 * tan( 75° ) = 7.464.

To arrive at the “v” component at, say, -60°, you take its absolute distance from the bottom as a share of the total value range: 2.000 / 7.464 = 0.268.
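The corrected computation can be sketched like this (the function name is illustrative; latitudes are in radians, and the map is assumed to cover latMin … latMax):

```javascript
// v-coordinate as the share of the tan() distance from the map's bottom edge.
function mapV(lat, latMin, latMax) {
  const bottom = Math.tan(latMin);
  const top = Math.tan(latMax);
  return (Math.tan(lat) - bottom) / (top - bottom);
}

const deg = Math.PI / 180;
// mapV(-60 * deg, -75 * deg, 75 * deg) → 2.000 / 7.464 ≈ 0.268
```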

This topic is very interesting, but one needs to understand the math first. Some knowledge of Differential Geometry helps, but it is not necessary. This blog does not address exactly the same problem, but at least the math is correct: Converting Web Mercator projection to equirectangular
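For the true-Mercator case the blog addresses, the inverse step that recovers latitude from a Mercator y value is the Gudermannian function; a sketch (names are illustrative, and the map’s y range yMin … yMax is assumed known):

```javascript
// Recover latitude (radians) from a normalized v on a true Mercator map.
function latFromMercatorV(v, yMin, yMax) {
  const y = yMin + v * (yMax - yMin);
  // Inverse of y = ln(tan(PI/4 + lat/2)), i.e. the Gudermannian function:
  return 2 * Math.atan(Math.exp(y)) - Math.PI / 2;
}
```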

If you are lazy (like me): shouldn’t a Mercator map project correctly onto a sphere as long as you are projecting directly from the side? The top and bottom may not look pretty, but they should be mostly accurate…

Yes. I had no problem at all, and the result looks pretty good to me, even running on an iPhone 6E. But I just borrowed Three.js models; everything else is plain WebGL. However, I sample the texture in the fragment shader… I mean, cartesian → spherical (equirectangular) → Mercator, and finally sample the texture.
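Doing the whole chain in the fragment shader might look roughly like this with a three.js ShaderMaterial (a sketch, not the code behind the linked demo; `vPosition` as a unit-sphere position and the ±2.542 y-range are assumptions):

```javascript
// Fragment shader: cartesian → spherical → Mercator → texture lookup.
// Assumes the vertex shader passes the unit-sphere position as vPosition.
const fragmentShader = /* glsl */ `
  uniform sampler2D map;
  varying vec3 vPosition;
  const float PI = 3.141592653589793;
  void main() {
    float lon = atan(vPosition.z, vPosition.x);        // -PI … PI
    float lat = asin(clamp(vPosition.y, -1.0, 1.0));   // -PI/2 … PI/2
    float u = lon / (2.0 * PI) + 0.5;
    float y = log(tan(PI / 4.0 + lat / 2.0));          // Mercator y
    float v = clamp((y + 2.542) / (2.0 * 2.542), 0.0, 1.0); // assumed map range
    gl_FragColor = texture2D(map, vec2(u, v));
  }
`;
// Would be passed to: new THREE.ShaderMaterial({ fragmentShader, ... })
```

This moves all the per-pixel Mercator math onto the GPU, so no resampled intermediate texture is needed.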

https://krotalias.github.io/cwdc/13-webgl/extras/LightingWithTexture.html
