I have a project with an uncontrolled source of models that get loaded into the scene (don’t ask why, it’s a given). Now I want to prevent the case where a heavy model gets uploaded, causes performance issues, and makes the whole scene unusable (too many polygons, textures that are too heavy, too much detail).
What are some possible ways to prevent it?
I can limit uploads to certain model extensions, for example glTF with compression.
Ideally, I’d like all models to be low-poly. Is there a way to check that?
I can definitely check the file size and the number of objects in the model once it’s uploaded.
What else can be done?
You can figure out how complex a model is from its vertex data.
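To sketch what reading that vertex data could look like: for an uncompressed .gltf (which is plain JSON), each mesh primitive points at a POSITION accessor whose `count` is the vertex count, and optionally an index accessor. A minimal Node sketch (the function name and sample data are mine, not from any library):

```javascript
// Minimal sketch: estimate mesh complexity from a glTF's JSON header.
// Assumes an uncompressed .gltf file already parsed into an object;
// Draco-compressed or .glb files would need decoding first.
function estimateComplexity(gltf) {
  let vertices = 0;
  let triangles = 0;
  for (const mesh of gltf.meshes ?? []) {
    for (const prim of mesh.primitives ?? []) {
      const pos = gltf.accessors?.[prim.attributes?.POSITION];
      if (pos) vertices += pos.count;
      const idx = gltf.accessors?.[prim.indices];
      // glTF primitive mode 4 (TRIANGLES) is the default when omitted
      const mode = prim.mode ?? 4;
      if (mode === 4) {
        const indexCount = idx ? idx.count : pos ? pos.count : 0;
        triangles += Math.floor(indexCount / 3);
      }
    }
  }
  return { vertices, triangles };
}

// Example: a tiny hand-written glTF fragment with one indexed triangle.
const sample = {
  meshes: [{ primitives: [{ attributes: { POSITION: 0 }, indices: 1 }] }],
  accessors: [{ count: 3 }, { count: 3 }],
};
console.log(estimateComplexity(sample)); // { vertices: 3, triangles: 1 }
```

With the totals in hand you can reject anything above your "low poly" ceiling before it ever reaches the renderer.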
File size, vertex count: you could auto-generate these checks in Node quite easily. But eventually it depends on the machine as well. There’s this project: GitHub - pmndrs/detect-gpu: Classifies GPUs based on their 3D rendering benchmark score, allowing the developer to provide sensible default settings for graphically intensive applications. It could be key to solving such issues from the ground up, though I wish it would receive more help from the WebGL community to keep it up to date or crack vendor obfuscation. I have used it before with pretty good aim, but alas, it thinks my M1 is tier-1 fallback thanks to Apple hiding the GPU readout.
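To make the auto-generated check concrete, here is a minimal Node sketch of an upload gate. All the thresholds and names (`BUDGET`, `validateModel`) are hypothetical defaults I made up; you would tune them per project, or scale them at runtime by something like a detect-gpu tier:

```javascript
// Minimal sketch of an upload gate: reject models that blow the budget.
// The limits below are hypothetical, not recommendations.
const BUDGET = {
  maxFileBytes: 10 * 1024 * 1024, // 10 MB on disk
  maxVertices: 150000,            // a "low poly" ceiling
  maxObjects: 200,                // scene-graph node count
};

function validateModel(stats, budget = BUDGET) {
  const errors = [];
  if (stats.fileBytes > budget.maxFileBytes) {
    errors.push(`file too large: ${stats.fileBytes} > ${budget.maxFileBytes}`);
  }
  if (stats.vertices > budget.maxVertices) {
    errors.push(`too many vertices: ${stats.vertices} > ${budget.maxVertices}`);
  }
  if (stats.objects > budget.maxObjects) {
    errors.push(`too many objects: ${stats.objects} > ${budget.maxObjects}`);
  }
  return { ok: errors.length === 0, errors };
}

// Usage: stats would come from the uploaded file and its parsed glTF JSON.
console.log(validateModel({ fileBytes: 2048, vertices: 5000, objects: 10 }).ok);   // true
console.log(validateModel({ fileBytes: 2048, vertices: 900000, objects: 10 }).ok); // false
```

Rejecting at upload time keeps the bad asset out entirely; the GPU-tier angle would only adjust which budget you apply, not the shape of the check.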