Geometry rendering differently on different devices

Hello all,

I’m currently making a small app, and one of the components parses a .obj file, stores the position of each vertex in a texture, and recreates the model using line elements that update their positions based on the data in that texture.
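Roughly, the setup looks like this (a simplified sketch rather than the exact sandbox code; the names here are just illustrative):

```js
import * as THREE from 'three';

// Pack each parsed vertex position into one RGBA texel of a DataTexture.
function makePositionTexture(positions /* Float32Array of xyz triplets */) {
  const count = positions.length / 3;
  const data = new Float32Array(count * 4);
  for (let i = 0; i < count; i++) {
    data[i * 4 + 0] = positions[i * 3 + 0];
    data[i * 4 + 1] = positions[i * 3 + 1];
    data[i * 4 + 2] = positions[i * 3 + 2];
    data[i * 4 + 3] = 1.0;
  }
  const tex = new THREE.DataTexture(data, count, 1, THREE.RGBAFormat, THREE.FloatType);
  tex.needsUpdate = true;
  return tex;
}

// The line material looks each vertex up by index instead of using the
// built-in `position` attribute (the geometry carries a `vertexIndex` attribute).
const lineMaterial = new THREE.ShaderMaterial({
  uniforms: {
    positionTexture: { value: null },
    texWidth: { value: 1 },
  },
  vertexShader: /* glsl */ `
    uniform sampler2D positionTexture;
    uniform float texWidth;
    attribute float vertexIndex;
    void main() {
      // Sample the texel centre that holds this vertex's current position.
      vec2 uv = vec2((vertexIndex + 0.5) / texWidth, 0.5);
      vec3 pos = texture2D(positionTexture, uv).xyz;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    void main() { gl_FragColor = vec4(1.0); }
  `,
});
```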

The problem, however, is that while this works exactly as expected on my laptop, testing on other devices yields significantly different results, and I’m not entirely sure what the root cause is.

I’ve created a small example of the problem on CodeSandbox.io, and pictures can be found below of what the default bunny model looks like on my computer vs what I see on another device.

Output on device 1:

Output on device 2:

Any help would be appreciated!

Also, a note on why I’m loading the geometry this way: as a small project to try out GPGPU, I made a script that simulates the .obj as if it were made of one-dimensional springs, with each node computed independently on the GPU.
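A very rough sketch of the kind of compute pass I mean, using GPUComputationRenderer here for brevity (heavily simplified and not the exact sandbox code; the real version also reads a connection texture to find each node’s neighbours):

```js
import * as THREE from 'three';
import { GPUComputationRenderer } from 'three/addons/misc/GPUComputationRenderer.js';

// Sketch only: one variable holds per-node positions, and the compute shader
// pulls each node toward a rest position with a simple spring term.
// `renderer` and `lineMaterial` are assumed to exist already.
const SIZE = 64; // SIZE x SIZE texels, one node per texel
const gpuCompute = new GPUComputationRenderer(SIZE, SIZE, renderer);

const posTex = gpuCompute.createTexture(); // RGBA float texture
// ...fill posTex.image.data with the parsed vertex positions here...

const positionVariable = gpuCompute.addVariable('texturePosition', /* glsl */ `
  uniform sampler2D restTexture;
  uniform float delta;
  void main() {
    vec2 uv = gl_FragCoord.xy / resolution.xy;
    vec4 pos  = texture2D(texturePosition, uv);
    vec4 rest = texture2D(restTexture, uv);
    // Spring toward the rest position (stiffness hard-coded to 4.0).
    pos.xyz += (rest.xyz - pos.xyz) * 4.0 * delta;
    gl_FragColor = pos;
  }
`, posTex);

gpuCompute.setVariableDependencies(positionVariable, [positionVariable]);
positionVariable.material.uniforms.restTexture = { value: null /* rest-pose texture */ };
positionVariable.material.uniforms.delta = { value: 1 / 60 };

const error = gpuCompute.init();
if (error !== null) console.error(error);

// Each frame: run the pass, then hand the result to the line material.
gpuCompute.compute();
lineMaterial.uniforms.positionTexture.value =
  gpuCompute.getCurrentRenderTarget(positionVariable).texture;
```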

EDIT:
Additional Information:
The working device is a Windows laptop running Windows 11 Pro (build 22631.3880), with an Intel Core i7-8650U and Intel UHD Graphics 620.

The other device, with the problem, is a laptop running Windows 10 Home (version 22H2), with an Intel Core i7-9750H and an NVIDIA GeForce GTX 1660 Ti (driver 31.0.15.3598).

Most of my testing has been on Firefox (version 120.0, 64-bit), but the issue persists in Chrome and Edge.

It’s also worth noting that I’ve had a few friends test on their phones/Macs, and none so far seem to have the expected behavior, so the expected behavior is definitely the exception rather than the rule.

Using Three.js r167.

Different GPUs may run shaders at different precision, and something in your GPU physics simulation might be sensitive to that change in precision.

You might want to try setting the shader precision manually in your shaders, if you’re not doing it already… or, on the working platform, try explicitly setting a lower precision and see if that reproduces the issue.
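In three.js there are a couple of places you can force this, for example (assuming your lines/simulation use a ShaderMaterial; names here are illustrative):

```js
import * as THREE from 'three';

// Per material: overrides the renderer default for that one material.
const lineMaterial = new THREE.ShaderMaterial({ /* your shaders */ });
lineMaterial.precision = 'highp'; // or 'mediump' / 'lowp'

// Renderer-wide default, chosen at construction time.
const renderer = new THREE.WebGLRenderer({ precision: 'lowp' });
```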

Also, driver behavior can have slight variations across “devices”. What are the actual devices/GPUs/CPUs/OSes/browsers you’re testing against?

Thanks for the reply! Explicitly setting float precision to highp doesn’t seem to have any effect, and setting it to lowp doesn’t break it either.

Apologies for referring to vague “devices”. The working device is a Windows laptop running Windows 11 Pro (build 22631.3880), with an Intel Core i7-8650U and Intel UHD Graphics 620.

The other device, with the problem, is a laptop running Windows 10 Home (version 22H2), with an Intel Core i7-9750H and an NVIDIA GeForce GTX 1660 Ti (driver 31.0.15.3598).

Most of my testing has been on Firefox (version 120.0, 64-bit), but the issue persists in Chrome and Edge.

It’s also worth noting that I’ve had a few friends test on their phones/Macs, and none so far seem to have the expected behavior, so the expected behavior is definitely the exception rather than the rule.

Also, the full app is available to view at GPGPU Experimentation, though it has a lot more code than the minimal example in the original post.

Found the solution! Or at least a partial one. It looks like resizing the data texture I was using to store the connections between the different elements so that its size is a power of 2 fixed it! I’m not entirely sure why this would cause a problem (any insights would be appreciated!), so I’ll keep this thread up for a day or two before marking this as the solution in case anyone can explain.
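For reference, the change amounts to something like this (a simplified sketch with hypothetical names, not the exact sandbox code): round the texture width up to the next power of two and pad the unused texels, instead of sizing it exactly to the element count.

```js
import * as THREE from 'three';

// Round a texture dimension up to the next power of two.
function nextPowerOfTwo(n) {
  return Math.pow(2, Math.ceil(Math.log2(n)));
}

// Connection texture sized to a power of two, with the tail left zeroed.
function makeConnectionTexture(connectionData /* Float32Array, 4 floats per element */) {
  const count = connectionData.length / 4;
  const width = nextPowerOfTwo(count); // power-of-two width instead of `count`
  const data = new Float32Array(width * 4);
  data.set(connectionData); // remaining texels stay zero
  const tex = new THREE.DataTexture(data, width, 1, THREE.RGBAFormat, THREE.FloatType);
  tex.minFilter = THREE.NearestFilter;
  tex.magFilter = THREE.NearestFilter;
  tex.needsUpdate = true;
  return tex;
}
```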
