I am working with the FPS example from the official three.js website, and I tried loading a glb model with a size of 60 MB. The browser crashed, but when the glb file is smaller than 10 MB it loads fine.
What could be a solution for loading a model of around 50 MB? Is there any option for that?
Most servers have a limit on the file size you can upload. It can be found in php.ini:
upload_max_filesize = 10M
post_max_size = 10M
So if the FPS example of three.js is hosted on the official website, you can't upload anything larger than 10 MB.
EDIT: But if you are running your own server you can change that in php.ini.
Or try to upload via FTP
Thanks for the reply,
But the glb file is already on my local server. I loaded that file using GLTFLoader, and then when I load the web page it crashes. Can php.ini fix this issue? It's not that I'm uploading anything, though.
The size your server might let you upload is unrelated to what your browser can render. Moreover, size in MB is only one small part of what makes a model efficient to render.
Try running something like this on the model:
npm install --global @gltf-transform/cli
gltf-transform inspect model.glb --format md
Could you share the results you get? The output contains more information than just the file size, which matters for figuring out how to optimize the model.
Thank you very much, I will try. Can I know what this command actually does to the model? Will it give us detailed information about the model?
And may I know how to create an efficient model? Are there any guidelines related to three.js rendering to follow when creating one?
It does nothing to the model, just prints a bunch of stats that will help to figure out why your browser might be crashing.
It’s hard to give general advice on the appropriate size for a model, it depends, but I think the Khronos 3D Commerce guidelines (intended for e-commerce sites) are a reasonable starting point.
Copied a key section below:
- File Size: Ideally less than 5MB. Note that as glTF geometry and texture compression extensions, such as glTF Universal Textures using the KTX container and geometry compression using Draco, become widely available, smaller assets or more visual fidelity at the same asset size will be possible.
- Draw calls: should be minimized by consolidating meshes, and using fewer materials.
- Triangle Count: 100,000 triangles or less.
- Texture Aspect Ratio: use power of 2 resolutions, square aspect ratio is not required.
- Texture Size: Use 1024×1024 (1K) or 2048×2048 (2K) for BaseColor, ORM and Emissive maps. 2K is recommended for Normal maps, which are more sensitive to reduced resolution than even Albedo maps. Normal maps are also severely sensitive to JPG artifacts: a 2K JPG gives about the same quality as a 1K PNG normal map.
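The power-of-two rule above is easy to check programmatically. A minimal sketch in plain JavaScript (the helper name is my own, not from the guidelines):

```javascript
// A positive integer n is a power of two exactly when it has a
// single bit set, i.e. (n & (n - 1)) === 0.
function isPowerOfTwo(n) {
  return Number.isInteger(n) && n > 0 && (n & (n - 1)) === 0;
}

console.log(isPowerOfTwo(1024)); // true  (1K, fine for BaseColor/ORM/Emissive)
console.log(isPowerOfTwo(2048)); // true  (2K, recommended for Normal maps)
console.log(isPowerOfTwo(1000)); // false (not a power of two)
```

If you are already importing three.js, it ships an equivalent helper, `THREE.MathUtils.isPowerOfTwo`.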
Very well understood, thank you. But I still wonder how services like Artsteps launch huge exhibition halls in the browser without crashing. I cannot imagine the size of those models being less than 5MB, though.
Most 3D experiences are a big stack of tricks in a trenchcoat. Choosing the right tricks is the part that takes some experience. Size on disk has more to do with how quickly the thing loads, and very little to do with how efficiently it runs.
i have had projects where the designer would hand me a 100mb model and i made that into ~200kb without visual difference. i would not consider a model larger than 5mb and even that seems way too big. imagine, your 60mb model, someone with a mobile device sits in a train, after an hour the commute ends and the site will finally load. what kind of experience would this be?
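to put rough numbers on that train scenario (the ~1 MB/s mobile bandwidth figure is my own assumption for illustration; a weak connection is often far worse):

```javascript
// rough download-time estimate, ignoring parsing and GPU upload time
const modelSizeMB = 60;   // the model from the question
const bandwidthMBps = 1;  // assumed flaky mobile connection

const seconds = modelSizeMB / bandwidthMBps;
console.log(seconds + "s just to download"); // 60s before a single frame renders

// the same math for a ~200kb optimized model: 0.2s
console.log(0.2 / bandwidthMBps + "s");
```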
compression usually starts with blender, edit mode > mesh > cleanup > merge vertices by distance, and then modifiers > decimate, for starters. then you have a lot of tools that handle mesh data (draco, meshopt) and textures (webp, squoosh, …).
though try typing this:
npx gltfjsx yourmodel.glb --transform
this will run a bunch of optimizations. how much did it take off your 60mb?
Hi, this is what I see after executing the command
gltf-transform inspect model.glb --format md
You have a good point. Do you have any compression solutions other than Blender?
Noted sir! Thanks for the advice.
When I try to execute that command, this error comes up:
Hm, this doesn’t seem like it should be 60 MB? There’s only about 14MB of geometry there and no textures at all. The vertex count could be reduced, 500,000 vertices is a lot. Decimating in Blender, or using gltfpack, would be the easiest ways to clean that part up. But I’m a bit surprised this model would be crashing your browser.
most gltf tools need node 16 unfortunately, i think squoosh is the culprit here. if you have nvm:
nvm install 16 # you only need to do this once
nvm use 16
npx gltfjsx model.glb --transform
If this is the exact model you're trying to optimize, I would not worry too much about the compression we're talking about above, like Draco. That'll reduce file size (which is good), but the number of vertices remains the same, so whatever is crashing your browser is not going to be fixed by it. What is compressed must be decompressed before the browser can render it. The only likely problem I see in this model is too many vertices.
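A rough way to see why vertex count matters more than file size here, assuming a typical uncompressed attribute layout of position + normal + uv (my assumption, not something read from the inspect output):

```javascript
// Back-of-envelope memory for raw vertex attributes after decompression.
// Assumed layout: position (3 floats) + normal (3 floats) + uv (2 floats),
// 4 bytes per float = 32 bytes per vertex.
const vertexCount = 500_000;
const bytesPerVertex = (3 + 3 + 2) * 4; // 32
const totalMB = (vertexCount * bytesPerVertex) / (1024 * 1024);

console.log(totalMB.toFixed(1) + " MB"); // 15.3 MB, in the ballpark of the ~14 MB of geometry above
```

Draco would shrink that on disk, but after decoding, the GPU still has to hold and process all 500,000 vertices, which is why decimation helps where compression alone does not.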
Yeah, I wonder too how this 14mb can also make the browser crash. Can I upload my html and glb so you can take a look? Maybe, as you said, a high vertex count is the issue…
brought it down to ~1mb using the tool above loving-voice-0gx4d9 - CodeSandbox