My ThreeJS Demo works on my Laptop but not on others

Hi, this is my demo on NASA's Black Marble with R3F.

The problem I'm facing is that this site works perfectly on my laptop (HP cs15-1000TX), but I've asked a few of my friends to test it out, and it turns out all of them are having a different experience (some even had a MacBook Pro). For them, the site doesn't scroll beyond a few sections, but that hasn't been the case for me.

Interestingly, there were no errors in the console when my friends tried it, and I don't understand how to test and diagnose this issue.

Can you please check out this site and let me know what I can do? Is it working fine on your machine?


Here you go. Firefox 109.0 has errors and shows nothing:

(screenshot attached)

Chrome 109.0.5414.119 - I can scroll down to Russia. Is this the end? After reaching Russia, I cannot scroll down or up.

(screenshot attached)

– Pavel

MBP, won’t scroll after a bit.

@drcmda and @PavelBoytchev That's exactly what the problem is. I don't understand how it's only working on my device but not on others'.

Any suggestions on how I should debug this? So far I'm also getting stuck at Afghanistan in Edge; Chrome and Brave work perfectly for me. Firefox somehow refuses to start.

My demo - (the video is compressed to fit the size limit and was choppy due to screen recording. Otherwise, it's fine.)


Yes. Use another computer that replicates the problem. Then you'd need a bunch of console.logs to silently trace what is going on under the hood.
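For example, a tiny sketch of that kind of tracing (the handler name onWheel below is hypothetical, not from the demo's code):

```javascript
// Wrap a handler so every invocation is logged before running.
// traced() is a generic helper; "onWheel" is just an example label.
function traced(name, handler) {
  return (...args) => {
    console.log(`[trace] ${name} fired`)
    return handler(...args)
  }
}

// Hypothetical usage on the suspect listener:
// window.addEventListener('wheel', traced('onWheel', onWheel))
```

On the machine where scrolling stops, a silent trace log that suddenly goes quiet tells you which handler died.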

– Pavel

That's exactly the problem: there's no error except in Firefox. So how do I approach it?


I do not use R3F. Maybe someone with R3F experience can give a better suggestion. For the time being, I can only propose:

– Pavel

Hi @drcmda and @PavelBoytchev, what I found out is that my laptop has 24GB of memory, and most of the other laptops I tested my site on had under 16GB. Chrome is using 4-5GB of RAM on my end, jumping from 9GB when idle to 14GB, as I'm using many 16K textures. Since my memory is big enough, I guess Chrome handled it.

I don't know why Brave and Edge weren't able to work the same way, since both are Chromium browsers. But I'm definitely sure that my site is currently very RAM hungry.

What I want to know is how I can dynamically preload and dispose textures based on scroll logic to make this more efficient. Also, is that approach better?

Preloading in R3F


Disposing in R3F

const [texture] = useKTX2(textureFilePath)

@donmccurdy I’d too like to hear your view on it.

without seeing code it's impossible to say, but it's highly likely that you execute code that works on windows but not on mac, and since your scroll stops i'm led to believe it's scroll events.

What I want to know is how I can dynamically preload and dispose textures based on scroll logic to make this more efficient. Also, is that approach better?

you should not. all textures need to be loaded and active, visible or not. if you dispose and re-add at runtime depending on scroll position it will be worse, and there will be severe lag. i would rather make sure to compress everything, make the textures 1k, or 2k max, reduce the vertices of your assets, etc. if it isn't scroll events, then maybe you're using assets that should not be used on the web.
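for scale, a rough back-of-the-envelope sketch (assumption: the texture ends up as uncompressed RGBA8 in GPU memory, which is what regular png/jpg textures decode to; ktx2/basis transcodes to compressed GPU formats and stays far smaller):

```javascript
// Estimate GPU memory for one uncompressed RGBA8 texture.
// A full mipmap chain adds roughly one third on top of the base level.
function textureBytes(width, height, mipmaps = true) {
  const base = width * height * 4 // 4 bytes per pixel (RGBA8)
  return mipmaps ? Math.ceil((base * 4) / 3) : base
}

console.log(textureBytes(16384, 16384, false) / 2 ** 30) // 1 (GiB) per 16k texture
console.log(textureBytes(2048, 2048, false) / 2 ** 20)   // 16 (MiB) per 2k texture
```

a handful of 16k textures at that rate is many gigabytes, which is why 1k or 2k plus compression is the realistic budget.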


Maybe you could optimize the earth texture?!? You do not need a full texture for the whole globe all the time. While the earth is spinning, it is so fast that even a low-resolution texture would be enough. When motion is slow, you only need a texture of 1/8 of the earth (or less). Also, you could split the earth into sections and give textures to only some of the sections; you do not need textures for oceans, for example.

But before doing optimizations on the texture, make sure that this is the actual cause. Make a low-res texture and try it on other computers. If it works fine, then the texture size is an issue.

– Pavel

I too thought it might be some scrolling issue. But now that I've added support for key up and down, I got to know it's definitely because of the many textures I'm using, as mentioned below.

I’ll surely take your suggestion and try to experiment to see what works the best.

Full code available here - at 2db57dfc49c7defc55f389bc30fd420119735e26 · experientia-in/ · GitHub

  const china2012 = "assets/img/nasaBlackMarble/8k/2012_china.ktx2";
  const egypt2012 = "assets/img/nasaBlackMarble/8k/2012_egyptMiddleEast.ktx2";
  const india2012 = "assets/img/nasaBlackMarble/8k/2012_india.ktx2";
  const nasaBlackMarble2016 =
  const nasaBlackMarble2012 =
  const afg_hlg = "assets/img/nasaBlackMarble/8k/afghanistan_hlg.ktx2";
  const arg_hlg = "assets/img/nasaBlackMarble/4k/argentina_hlg.ktx2";
  const argRailway_hlg = "assets/img/nasaBlackMarble/4k/argentinaRailway_hlg.ktx2";
  const pak_hlg = "assets/img/nasaBlackMarble/8k/pakistan_hlg.ktx2";
  const rus_hlg = "assets/img/nasaBlackMarble/4k/russia_hlg.ktx2";
  const syr_hlg = "assets/img/nasaBlackMarble/8k/syria_hlg.ktx2";
  const yem_hlg = "assets/img/nasaBlackMarble/8k/yemen_hlg.ktx2";

  const [
  ] = useKTX2([

What you're referring to sounds very similar to GPU picking; I'd assume that's more complex in general. @drcmda, is there any module in R3F for GPU picking?

I'd have done it the way you're suggesting (layering on top of one another), but what I've found is that even if your texture has transparency, it'll still occupy the same space in GPU memory as an opaque one, so there's not much benefit in doing so.

VRAM in browsers is limited to 8GB max, so I'm surprised this is working at all.

Even with 64GB of physical RAM, running the following code in Chrome will report 8GB available on desktop and 4GB in mobile browsers…

const memory = navigator.deviceMemory
console.log(`This device has at least ${memory}GiB of RAM.`)


You can also find sites that will tell you the available memory of the device you're on, such as this site, if you scroll down to "live demo".

You're going to have to find a way to limit your texture sizes if you want a smooth cross-device and cross-browser application. As @drcmda has said, disposing and recreating textures of this size by trying to load them dynamically on scroll is going to cause you more lag than anything.

In an ideal world, WebGL would function and render on a per-pixel VRAM budget. Google Maps, for instance, utilizes this approach; you can see a detailed insight into optimisation options for big WebGL scenes here.


i wouldn’t use gpu picking, it makes things unnecessarily complex, you need to refactor the entire project around that. use this instead:

- <Scene />
+ <Bvh>
+   <Scene />
+ </Bvh>

though i don't think any of this is your problem. you're kind of guessing and that's bad. your situation couldn't be easier to solve: your laptop appears to be the only machine that can run it, so just borrow another and profile the issue. if you have the freezing thing in front of you, one or two console.logs will immediately tell you what's what.

if i go to chrome devtools and profile the site it just runs, there are no huge chunks of processing time stopping the site from working. even if it completely freezes the frameloop happily plows on without interruption.

one thing that for sure is a bug: your useEffect doesn't have cleanup. if for whatever reason it re-mounts (for instance strict mode, or suspense, …) then all side effects are left hanging, so you will have two wheel events, or three, firing at the same time. two or three or four gsap controllers, in other words race conditions: one piece of code pulls the camera here, one pulls it there, which could lead to the site going nowhere, looking like a freeze. this would explain why the framerate is OK even after the freeze.

This is only an example using timeouts, but it goes for everything: intervals, rAFs, event listeners. if i see a useEffect without return () => i kind of have no faith in it working correctly …

useEffect(() => {
  // If you create a side effect ...
  const handler = setTimeout(() => ..., 1000)
  return () => {
    // You must clean it up here ...
    clearTimeout(handler)
  }
}, [])
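to see the race outside react, here is a self-contained sketch with a plain stand-in for the wheel listener (no r3f, all names are made up):

```javascript
// Minimal stand-in for window's wheel listeners.
const listeners = new Set()
const addWheel = (fn) => listeners.add(fn)
const removeWheel = (fn) => listeners.delete(fn)
const fireWheel = () => listeners.forEach((fn) => fn())

let fired = 0
function mount() {
  const handler = () => { fired += 1 }
  addWheel(handler)
  // the returned function plays the role of useEffect's cleanup
  return () => removeWheel(handler)
}

// mount twice WITHOUT cleanup: one event now runs two handlers
mount()
mount()
fireWheel()
console.log(fired) // 2 — the duplicated-handler race

// reset, then mount/unmount/mount WITH cleanup: exactly one handler
listeners.clear()
fired = 0
const cleanupFirst = mount()
cleanupFirst()
const cleanupSecond = mount()
fireWheel()
console.log(fired) // 1
```

without the cleanup call, every re-mount stacks another live handler, exactly the "two or three wheel events firing at once" scenario.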


I’m not suggesting this. Transparency in textures does not save memory space. My suggestion was to split the Earth and its texture into chunks and keep in memory only the chunks that are visible.

When the site focuses on a country, you can use higher resolution texture for that country, all the rest could be low resolution (at least) or not in memory (at most).
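As an illustrative sketch of that selection logic (the chunk ids and file paths below are made up, not from your project):

```javascript
// Choose a texture per chunk: high-res only for the focused chunk,
// low-res for the rest, and none at all for chunks that are pure ocean.
function chooseTextures(chunks, focusedId) {
  return chunks.map((chunk) => {
    if (!chunk.hasLand) return { id: chunk.id, texture: null }
    const res = chunk.id === focusedId ? '8k' : '1k'
    return { id: chunk.id, texture: `assets/chunks/${chunk.id}_${res}.ktx2` }
  })
}

console.log(chooseTextures(
  [
    { id: 'india', hasLand: true },
    { id: 'pacific', hasLand: false },
    { id: 'russia', hasLand: true },
  ],
  'india'
))
// india gets the 8k path, russia the 1k path, pacific no texture
```

Re-running this whenever the focus changes, and loading only the returned paths, keeps just one high-res chunk resident at a time.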

Because this is not so straightforward to implement, first make sure that the texture size is the issue.

– Pavel

Sorry for the late response; I was away from home, so I didn't get access to my laptop.

Today, while going through my Git commits, I found this version (Vite + React) in which I was using scroll events instead of wheel events, which is what's in use right now. My friend also told me that this old (dev) version did work on his laptop till the end. The only problem with that version was that the scroll logic was glitchy. So right now I'm comparing both versions to see where I can go by modifying a few things.

Thanks for your response. I'll surely implement that cleanup function and also post an update on the progress.

Edit - Firefox is giving me this warning:

This site appears to use a scroll-linked positioning effect. This may not work well with asynchronous panning; see Scroll-linked effects — Firefox Source Docs documentation for further details and to join the discussion on related tools and features!

Hi there. If you have some minutes, can you please check this question?

@drcmda One thing I forgot to ask: can you please give me any suggestions on what exactly I should look for in the Performance tab in devtools while testing? So far, all the tutorials/blogs I've seen were focused only on profiling normal static sites.

@forerunrun Yeah, I can see both my laptop and my Samsung A52 report 8GB of memory.

from all that i see this is a normal static site. profiling means you record a bit while moving around the site and then zoom into the frame bars. but i don't see anything wrong in the readout. what the FF link said sounds interesting though, how do you change content position when scroll happens? transform: translate3d?