I am creating an application using three.js where users can upload many STL models, but they can see only one at a time in the viewer. I am controlling the visibility of models by setting object3D.visible to true or false. Let's say a user has uploaded 10 parts of 100 MB each and is viewing only one model; the remaining 9 are hidden.
I want to know: what is the maximum amount of memory the renderer/scene can hold? I believe the memory limit per tab in Chrome is 4 GB. I want to track how much memory my renderer is holding, and if it exceeds my threshold (let's say 700 MB), I want to delete previously loaded hidden objects from the scene.
Can anyone help me figure out how to achieve this?
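For context, the visibility toggling looks roughly like this (models and selectedIndex are my own illustrative names):

// Show only the selected model; keep the rest hidden.
models.forEach(function (model, i) {
  model.visible = (i === selectedIndex);
});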
But WebGLRenderer.info only gives me the number of geometries and textures. It doesn't have the size of each geometry, and I want to know the size of objects in MB.
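In other words, renderer.info reports counts and draw statistics, not byte sizes:

console.log(renderer.info.memory.geometries); // number of geometries held by the renderer
console.log(renderer.info.memory.textures);   // number of textures held by the renderer
console.log(renderer.info.render.triangles);  // triangles drawn in the last frame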
OK, you won't find that there then. The file size may be hidden for security reasons in JS, although I'm not 100% sure on that. What you may be able to do is get a Blob reference for the object3D and use .size, maybe?
I created a Blob reference like this:

var Bb = new Blob([object], {type: "object"})

And I get a size of 15; I guess the unit is bytes. I uploaded an 11 MB file but I get only 15 bytes here. Presumably the Blob constructor just stringified the object to "[object Object]", which is 15 characters, rather than serializing the geometry data.
I am thinking of trying the following approach: get the positions and normals using geometry.attributes.position and geometry.attributes.normal. The position attribute gives the number of vertices, and the normal attribute gives the normals.
I am assuming one pixel is required to display one vertex, and that the system takes 24 bytes to display one pixel. We are not displaying the normals; the system holds them as a Float32Array, so that is 4 bytes per value.
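A rough sketch of that idea, simply summing the byteLength of each attribute's typed array instead of guessing per-pixel costs (estimateGeometryBytes is an illustrative name):

function estimateGeometryBytes(object3d) {
  // Sum the size of every BufferAttribute in the subtree.
  let bytes = 0;
  object3d.traverse(function (node) {
    if (node.isMesh && node.geometry) {
      const attributes = node.geometry.attributes;
      for (const name in attributes) {
        bytes += attributes[name].array.byteLength;
      }
      if (node.geometry.index) {
        bytes += node.geometry.index.array.byteLength;
      }
    }
  });
  return bytes;
}

console.log((estimateGeometryBytes(model) / (1024 * 1024)).toFixed(1) + ' MB');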
And is deleting the objects from the scene the only way to handle performance? Ideally, I want to keep the invisible objects in the scene itself, because if I delete a part and load it again using the loader, it takes time and I don't want to wait. Even though I make the previous parts invisible while loading a new part, I can see the memory of my application increasing every time I take a heap snapshot.
If there is a way to stop my application from counting the memory of the invisible parts as well, it would be really helpful. If there is no way, I will go with deleting the objects.
If your objects are loaded and cached in the browser's VRAM, you need to "garbage collect" them yourself by calling dispose() on geometries, materials, and textures to have them unloaded from memory. Think of it like loading a plane with steel mid-flight: if the plane is too heavy it will eventually crash, so if you want to take on another resource, say gold, you need to throw off some steel to fit the gold and stay under the weight limit.
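A minimal sketch of that disposal, plus the threshold check from the original question (disposeMesh and evictHiddenMeshes are illustrative names, and estimateGeometryBytes is the sketch from earlier in the thread):

function disposeMesh(scene, mesh) {
  scene.remove(mesh);
  if (mesh.geometry) mesh.geometry.dispose();
  if (mesh.material) {
    const materials = Array.isArray(mesh.material) ? mesh.material : [mesh.material];
    for (const material of materials) {
      if (material.map) material.map.dispose(); // free the texture too, if any
      material.dispose();
    }
  }
}

const THRESHOLD = 700 * 1024 * 1024; // 700 MB

// Dispose hidden meshes until the estimated total drops below the threshold.
function evictHiddenMeshes(scene, meshes) {
  let total = meshes.reduce((sum, m) => sum + estimateGeometryBytes(m), 0);
  for (const mesh of meshes) {
    if (total <= THRESHOLD) break;
    if (!mesh.visible) {
      total -= estimateGeometryBytes(mesh);
      disposeMesh(scene, mesh);
    }
  }
}

The caller should also drop its own references to the disposed meshes so the JS objects can be garbage collected.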
What file format are the models? You have the option to use Draco compression if they are in glTF format.
Otherwise, I don't know what the outcome would be, but you could potentially save the position and normal attributes of each object's geometry as their own array variables and dynamically rebuild the geometry when needed. I'm not sure whether that would save memory or not, though…
I'm using STL files; I cannot change the file format. I wonder, then, how browser-based games manage memory when they need to handle many more 3D objects and animations. I guess the memory limit per tab in Chrome is 4 GB, and games will definitely cross this limit. What will happen if a three.js game page crosses 4 GB? Will it crash?
Can you please tell me how I can recreate the object if I store geometry.attributes.position and geometry.attributes.normal?
Oof, I've had an iPhone GPU completely die with an early test scene I made by accidentally pumping too much geometry and too many 4K textures into the viewport. That being said, it was an old iPhone 6s, but it was 100% the scene that sent it to its grave.
I think with precompiled, closed-circuit systems like Unity and Unreal there's a level of optimisation, baking both reflections and light maps (and consciously making web-ready assets with optimised geometry and textures), that lets you get away with the illusion of a lot more effects and objects in a scene. You can achieve very similar results with THREE if you apply similar rules to your code structure, model and texture optimisation, and use instancing where possible. If you are merely chucking huge unoptimised models into your scene, you are always going to be limited in what you can do.
Regarding rebuilding… this was a random thought and may not be a lead at all. It would involve loading all your models, storing the position and normal attributes in an array of JS objects, [{type: obj1, positions: floatArray, normals: floatArray}, etc…], then feeding each "type" to a function that builds a new BufferGeometry from that type's positions and normals, applies a material, and returns the resulting mesh to be added to or removed from the scene where needed, as in the sketch below. Again, I don't know much; this is a theory and may be useless!
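A rough sketch of that rebuild step, assuming entries shaped like the array above (buildMeshFromEntry is an illustrative name):

import * as THREE from 'three';

// Rebuild a mesh from stored position/normal typed arrays.
function buildMeshFromEntry(entry, material) {
  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.BufferAttribute(entry.positions, 3));
  geometry.setAttribute('normal', new THREE.BufferAttribute(entry.normals, 3));
  return new THREE.Mesh(geometry, material);
}

// Usage:
// const mesh = buildMeshFromEntry(entries[0], new THREE.MeshStandardMaterial());
// scene.add(mesh);

Note that the typed arrays themselves still live in JS memory, so this mainly avoids re-parsing the file rather than reducing the total footprint.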
In the case of standard 3D objects, you can make them appear and disappear by adding them to, or removing them from, a visible parent. They are retained in memory but are not part of the scene.
However, this option does not appear to be available with a loaded glTF file which, I believe, is saved as a Group. You might be able to isolate individual objects in your glTF file and "repackage" them as 3D objects which you could add and remove. But I have not tried that, so I don't know for sure.
And I am not sure if, from a performance standpoint, the add/remove option is significantly better (or worse) than using the visible flag.
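For reference, the add/remove pattern is just this (scene and mesh assumed to already exist):

scene.remove(mesh); // mesh stays in JS memory but is no longer rendered
scene.add(mesh);    // re-attach later without reloading the file

Either way, the GPU buffers stay allocated until dispose() is called, so neither flag-toggling nor removal frees memory on its own.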
Textures are often the part of a three.js app that uses the most memory. It's important to understand that, in general, a texture takes width * height * 4 * 1.33 bytes of memory: 4 bytes per RGBA pixel, with the extra ~33% coming from the mipmap chain.
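As a rough sketch, that rule of thumb in code (assuming an ordinary image-based texture with mipmaps):

function estimateTextureBytes(texture) {
  const image = texture.image;
  if (!image) return 0;
  // 4 bytes per RGBA pixel, times ~1.33 for the mipmap chain.
  return image.width * image.height * 4 * 1.33;
}

// e.g. a 2048 x 2048 texture comes to roughly 22 MB.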