Hello all,
I’m currently making a small app in which one component parses a .obj file, stores the position of each vertex in a texture, and recreates the model using line elements that update their positions based on the data in that texture.
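For reference, the core of the packing step looks roughly like this (a minimal sketch; the helper names are mine, and in the real app the buffer goes into a THREE.DataTexture with FloatType):

```javascript
// Sketch: pack parsed .obj vertex positions into an RGBA float buffer
// suitable for a DataTexture. One texel per vertex.
function packPositions(positions /* [[x, y, z], ...] */) {
  // Pick a square power-of-two texture large enough for every vertex.
  const size = Math.max(1, 2 ** Math.ceil(Math.log2(Math.sqrt(positions.length))));
  const data = new Float32Array(size * size * 4); // RGBA per texel
  positions.forEach(([x, y, z], i) => {
    data.set([x, y, z, 1], i * 4); // w = 1 marks a used texel
  });
  return { data, size };
}

// Each line vertex gets a UV pointing at the CENTER of its texel, so the
// vertex shader can fetch its position with a single texture lookup.
function vertexUV(index, size) {
  const u = ((index % size) + 0.5) / size;
  const v = (Math.floor(index / size) + 0.5) / size;
  return [u, v];
}
```

Sampling at texel centers (the `+ 0.5`) matters here: sampling at texel edges can land on different texels on different GPUs.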
The problem, however, is that while this works exactly as expected on my laptop, testing on other devices yields significantly different results, and I’m not entirely sure what the root cause is.
I’ve created a small example of the problem on CodeSandbox.io, and pictures can be found below of what the default bunny model looks like on my computer vs what I see on another device.
Output on device 1:
Output on device 2:
Any help would be appreciated!
A note on why I’m loading the geometry this way: as a small project to try out GPGPU, I made a script that simulates the .obj as if it were made of one-dimensional springs, with each node computed independently on the GPU.
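The per-node update is conceptually something like the sketch below (plain JS standing in for the fragment shader that would run one invocation per node; the constants and names are assumptions, not my actual simulation code):

```javascript
// Sketch of a per-node spring update as it would run in a GPGPU pass.
// Each node reads only its own history and its neighbors' positions from
// the texture, so every node can be computed independently.
function updateNode(pos, prev, neighbors, rest = 1, k = 0.5, damping = 0.98, dt = 1 / 60) {
  let force = [0, 0, 0];
  for (const n of neighbors) {
    // Hooke's law along the spring between this node and its neighbor.
    const d = [n[0] - pos[0], n[1] - pos[1], n[2] - pos[2]];
    const len = Math.hypot(...d) || 1;
    const f = k * (len - rest);
    force = force.map((c, i) => c + (d[i] / len) * f);
  }
  // Position-based Verlet integration step.
  return pos.map((c, i) => c + (c - prev[i]) * damping + force[i] * dt * dt);
}
```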
EDIT:
Additional Information:
The working device is a laptop running Windows 11 Pro (build 22631.3880), with an Intel Core i7-8650U and Intel UHD Graphics 620.
The device with the problem is a laptop running Windows 10 Home (version 22H2), with an Intel Core i7-9750H and an NVIDIA GeForce GTX 1660 Ti (driver 31.0.15.3598).
Most of my testing has been in Firefox (version 120.0, 64-bit), but the issue persists in Chrome and Edge.
It’s also worth noting that a few friends have tested on their phones and Macs, and none of them see the expected behavior either, so the working case is definitely the exception rather than the rule.
Using Three.js r167.