Convert camera frustum to UV coordinates on a texture?

Hello! First off, I’m not sure I’m asking this the right way. Please bear with me as I explain what I’m trying to do.

I’m looking to implement something similar to DZI (deep zoom image) but for panoramas. My current solution is creating a large number of sphere slices (adjusting phi and theta values) and texturing them with image tiles. This works for the most part, but I think it uses more memory than necessary since I’m creating so many sphere geometries.

To optimize, I’m changing it to use a single sphere geometry with a large white dummy texture, and then using the copyTextureToTexture() function to load tiles into it.

Finally, my question: I’m stuck trying to figure out how to detect which part of the sphere geometry the camera is looking at, so I can update the corresponding tile. Any help is appreciated!

You could use Camera.getWorldDirection() to get the view direction and then use a raycaster to find the intersection point on your sphere. To be clear, the origin of the raycaster’s ray would be the camera’s position in world space, and the direction would be the camera’s view direction mentioned above.
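Alternatively, for a full sphere with a standard equirectangular texture, you can map the view direction to UV analytically and skip the raycast entirely. A minimal sketch of that idea, assuming an equirectangular convention where u wraps around the vertical axis and v runs from 0 at the bottom pole to 1 at the top (your geometry’s actual UV layout, e.g. SphereGeometry’s phiStart/phiLength, may differ, so verify against a test texture):

```javascript
// Map a unit view-direction vector [x, y, z] to equirectangular UV.
// Convention assumed here: u from atan2 around the vertical axis,
// v from the elevation angle (v = 1 at the top pole).
function dirToUV(dir) {
  const [x, y, z] = dir;
  const u = 0.5 + Math.atan2(x, z) / (2 * Math.PI);
  const v = 0.5 + Math.asin(y) / Math.PI;
  return [u, v];
}
```
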

An intersection result always contains the uv coordinates of the intersected geometry. You can access them like so:

const intersects = raycaster.intersectObject( object );

if ( intersects.length > 0 ) {

    // Intersections are sorted by distance, so [ 0 ] is the closest hit
    const intersection = intersects[ 0 ];
    console.log( intersection.uv );

}
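From there, turning the uv hit point into the tile to load is just a grid lookup. A sketch, where tilesPerSide is a hypothetical parameter for the tile grid at your current zoom level (names are illustrative, not from any library), and v is flipped because tile row 0 usually sits at the top of the image:

```javascript
// Map a { u, v } hit point to a { col, row } tile index in an
// n-by-n grid. Math.min clamps the edge case u === 1 or v === 0.
function uvToTile(uv, tilesPerSide) {
  const col = Math.min(Math.floor(uv.u * tilesPerSide), tilesPerSide - 1);
  const row = Math.min(Math.floor((1 - uv.v) * tilesPerSide), tilesPerSide - 1);
  return { col, row };
}
```

You could then fetch that tile and copy it into the big texture at pixel offset (col * tileSize, row * tileSize) with copyTextureToTexture().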


I didn’t know intersections also contain uv coordinates. That makes a lot of sense! Thanks!