I’m exporting a .json file that is huge (~80 MB), and the loading time is very long.
This file contains an entire scene (meshes, geometries, materials, etc.), and I was wondering if I could save that JSON file at a smaller size using some compression method. I found that Draco comes highly recommended, but when I tried exporting to .drcobj (since I’m using .json files), the result was not very significant (~75 MB).
What other kinds of compression methods should I try in my case?
No, I just exported the .json file with DrcobjExporter(), but since the problem persisted (the resulting file was a similar size), I decided to look for other options.
Can I just zip the JSON file when I export it, and then unzip the compressed file before loading it? How would I do that?
Large amounts of vertex data in JSON are not ideal – a binary format can be significantly smaller and more efficient to parse. For example, any binary data (e.g. the Draco-compressed payload) embedded in JSON will likely end up in a Base64-like encoding that costs extra time to parse at runtime. It looks like the drcobj library you linked to converts the JSON to a binary format after compression, so that’s good.
I’d be a bit surprised if gzip reduces the size here — Draco compression is compact already and doesn’t generally gzip much further. And the drcobj library seems to have zlib built in (the isDeflate option), which would likely be more efficient and more convenient than JSZip in this case.