Is it possible to set a custom compression algorithm that still conforms to the glTF spec for GLB files? Draco is compiled with Emscripten, so it's a pretty big dependency.
I made some local changes that compress/decompress all array buffers individually (all the geometry attributes, but not the image ones) with zlib, which only adds a 7 kB dependency but saves a lot. I'm not sure whether the spec allows any compression other than Draco, though.
You could compress the entire GLB file before sending it over the network (zlib would be a reasonable choice there) and decompress it before parsing with GLTFLoader.prototype.parse. Gzip is easier still and can give good results too.
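As a sketch of that whole-file route, assuming gzip on the wire and a browser with the native DecompressionStream API (the helper name is mine):

```javascript
// Sketch: fetch a gzip-compressed GLB, decompress it with the browser's
// native DecompressionStream, then hand the raw ArrayBuffer to the loader.
async function decompressGzip(compressedBytes) {
  const stream = new Blob([compressedBytes])
    .stream()
    .pipeThrough(new DecompressionStream('gzip'));
  return new Response(stream).arrayBuffer();
}

// Usage, assuming `loader` is a GLTFLoader instance:
// const res = await fetch('model.glb.gz');
// const glb = await decompressGzip(new Uint8Array(await res.arrayBuffer()));
// loader.parse(glb, '', (gltf) => scene.add(gltf.scene), console.error);
```

Note that if the server sets Content-Encoding: gzip, the browser decompresses transparently and none of this is needed.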
But the glTF spec itself does not support other compression methods within the file right now. There's a proposal for a smaller/faster compression scheme in the gltfpack tool. It won't compress your data as much as Draco in some cases, but it still looks promising, and it compresses animation data, which Draco does not. Because it's not an official part of the glTF spec, you'll need to use the custom version of GLTFLoader provided by that repository. If you try that route, I'd be curious to hear what your results are.
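For completeness, gltfpack runs as a build-time command-line step; going by its README, the invocation looks something like this (with -c enabling its mesh compression):

```shell
# Compress meshes (and animation data) with gltfpack's codec.
# The output requires the matching custom GLTFLoader build to decode.
gltfpack -i scene.glb -o scene.packed.glb -c
```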
I compressed the whole file originally, but that also compressed the images inside it, which caused a lot of CPU/memory usage when unpacking all the unnecessarily compressed textures.
gltfpack doesn't allow exporting from the browser, though, since it's a CLI tool.
Hm, I didn't realize you need to compress the model in the browser.
In that case you'll need file-level compression. You could consider unpacking the GLB into separate files, so that you can compress the .gltf and .bin files but not the textures. But no, you can't use arbitrary compression mechanisms and still have a valid glTF file, at least not without defining a custom extension and writing a custom GLTFLoader to parse it.
That custom extension might be just what I'm looking for! I'm going to read more about this topic.