๐Ÿ‰ Anyone using Draco?

Draco's compression ratio is impressive, and the decompression time is long but acceptable for a file of this size. Personally, though, I don't see a use for it outside of enormous models such as 3D scans. Decoding took ~7 seconds on my powerful desktop machine; on weak mobile devices it will take much longer. At least I don't see a use for it in games or typical apps, but for 3D scans, of course.

At a certain triangle count, an average device starts to struggle rendering the model anyway. This César model, for example, could be optimized so that it looks basically the same at only around 1-2 MB of uncompressed binary. Besides the many unnecessary faces, the flat details on the fabric or armor would look even better/smoother baked into normal maps. Scan distortion creates a lot of false detail.


You can always optimize meshes, independently of compression. The demo should just show the difference between the original OBJ file (more or less raw geometry data) and a compressed glTF asset.

Interesting. The pure decompression takes only ca. 2 seconds on my four-year-old iMac with the latest Chrome.


Yes, and I agree that Draco does a great job. I just don't see a reason to use it in general, since it's really only relevant for big files. For the web, one should always optimize the models first before trying to compress them as much as possible, since the triangle count is the more important factor for rendering when it comes to games and the like.

I really like Draco and will integrate it into my own format as well.

By optimizing scan models, baking details, etc., mobile devices won't struggle or die rendering them at the same visual quality. But that only applies to rendering a model in its entirety; with a LOD rendering technique the hardware limit can be avoided, though that requires storing additional data, which I don't think Draco handles.

I've got an i7-5930K (6 cores at 3.5 GHz), 32 GB DDR4 RAM, and a Titan X, with the latest Chrome.


Compressing media data is standard in many areas, like pictures, video, and audio. Nobody would come up with the idea of embedding uncompressed image data in a website (established formats like JPEG, PNG, and GIF all use compression by default). From my point of view, there is no reason why 3D model data should be a special case. Even small files benefit from compression, since the total delay (load + parse/decode) is in general smaller than loading the uncompressed data. For example, on a 10 Mbit/s connection a 10 MB mesh needs ~8 s just to download, while a 1 MB compressed version downloads in under a second, which leaves plenty of budget for decoding.


Of course compression is relevant, but for average-sized buffers, other, faster compression methods suffice as well. And as said, optimizing the actual content is the most important part, and it reduces the file size as well.


How does it generally compare to gzipped JSON? Why does JSON gzip better than a binary mesh? At a high level, gzip is a different, general-purpose compression algorithm, while Draco is specifically aimed at spatial data?

Theoretically, if the Draco decoder were part of the browser, there would be no question which format to use. But currently, since browsers can decompress gzip natively, zipped JSON might win when both size and decoding time are considered?

I literally just started testing Draco today for my non-skinned meshes. The file-size reduction is absolutely nuts: 171 KB => 9 KB, for example.

What I still need to benchmark is how much overhead the decompression adds, to see whether the faster download times are worth it. Will do that sometime this week.
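In case it helps anyone else benchmark the same thing, here is a minimal sketch of how to time the two phases separately. It assumes a recent three.js with the module loaders; 'model.glb' and the '/draco/' decoder path are placeholders.

import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('/draco/'); // folder containing the Draco decoder files

const gltfLoader = new GLTFLoader();
gltfLoader.setDRACOLoader(dracoLoader);

const t0 = performance.now();
fetch('model.glb')
  .then((res) => res.arrayBuffer())
  .then((buffer) => {
    const t1 = performance.now();
    console.log('download: ' + Math.round(t1 - t0) + ' ms');
    // parse() decodes a buffer that has already been downloaded,
    // so download and decode costs can be measured independently.
    gltfLoader.parse(buffer, '', (gltf) => {
      console.log('decode: ' + Math.round(performance.now() - t1) + ' ms');
    });
  });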


Draco is spatially aware and configurably lossy, so the compression ratio will be better than gzip's for geometry-heavy models. Decompression does take longer, though, so that is the tradeoff. Related: https://github.com/mrdoob/three.js/pull/15249.
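If you want to measure the gzip baseline yourself, modern browsers can compress a buffer natively via CompressionStream. A small sketch (the API exists in recent Chromium, Firefox, and Safari; 'model.obj' is a placeholder):

// Estimate how large a file would be after gzip, entirely in the browser.
async function gzippedSize(url) {
  const response = await fetch(url);
  const compressed = await new Response(
    response.body.pipeThrough(new CompressionStream('gzip'))
  ).arrayBuffer();
  return compressed.byteLength;
}

gzippedSize('model.obj').then((bytes) => console.log(bytes + ' bytes gzipped'));

Comparing that number with the Draco-compressed size of the same geometry gives a fair download-size comparison.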

So I made this little tool.

https://www.titansoftime.com/draco.html

It seems that for my purposes Draco takes a bit too long to decode. Since the assets get cached anyway, returning users would still pay that decode time while no longer getting much download benefit after the initial load.

Seems like this would be most beneficial in a mostly static, object-viewer type of situation with large mesh files; probably not best for games.

Oh well, still super cool. Legendary level compression.


There seems to be a memory leak when using Draco.

When not using Draco, I reloaded my scene objects about 100 times and my memory went up by 0.4 MB (some internal arrays).

I did the same using Draco, and my memory usage went up by 17 MB. The culprit here was the system ArrayBuffer. Does DRACOLoader need to be disposed of in some way?

This is reproducible in the link I posted above.
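A loop along these lines should reproduce the measurement. This is just a sketch: performance.memory is non-standard and Chrome-only, and gltfLoader stands in for a GLTFLoader with a DRACOLoader attached.

// Load and discard the same model repeatedly, then read the JS heap size.
async function leakCheck(gltfLoader, iterations) {
  for (let i = 0; i < iterations; i++) {
    const gltf = await new Promise((resolve, reject) =>
      gltfLoader.load('model.glb', resolve, undefined, reject)
    );
    // Dispose of the resources created for this iteration,
    // so that only leaked memory remains afterwards.
    gltf.scene.traverse((obj) => {
      if (obj.geometry) obj.geometry.dispose();
      if (obj.material) obj.material.dispose();
    });
  }
  console.log('heap: ' + (performance.memory.usedJSHeapSize / 1e6).toFixed(1) + ' MB');
}

leakCheck(gltfLoader, 100);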

Edit: It seems it just shoots up to 25 MB and stays there. Not a huge deal, I guess.


About the memory use, see THREE.DRACOLoader.releaseDecoderModule(); (and https://github.com/google/draco/issues/349).
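In other words, something along these lines once everything has loaded (a sketch; releaseDecoderModule() was a static method on THREE.DRACOLoader at the time, and loader/scene are assumed to exist):

loader.load('model.glb', (gltf) => {
  scene.add(gltf.scene);
  // Frees the Emscripten heap of the Draco decoder. The module is
  // re-created on demand if another Draco file is loaded later.
  THREE.DRACOLoader.releaseDecoderModule();
});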


Thanks for sharing the results.
Is this done using the JavaScript or the WebAssembly decoder?

Right now I am loading my scene as a ZIP file using THREE.ZipLoadingManager, and it is cutting the file size down by about half; it works great. The only thing I noticed is a second or two of really intense CPU activity during which everything freezes. Is Draco any better in that regard? If I understand correctly, Draco won't handle the texture data, only the mesh.

@Hyacinth Draco should give you significantly better compression on the geometry than ZIP compression. The decoding time is still nontrivial, but I don't know whether it will be more or less in your case. You may want to keep your textures out of the ZIP archive; they don't usually benefit much from that sort of compression.

We have a work-in-progress version of THREE.DRACOLoader that runs in a Web Worker, so (1) it can decode multiple models, or parts of a single model, in parallel, and (2) it doesn't block the main thread: https://github.com/mrdoob/three.js/pull/15249

You could also do your ZIP decompression in a web worker, but I don't have an example of that handy.
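Roughly, the idea would look like this — an untested sketch, assuming the fflate library (any ZIP library that runs inside a worker would do):

// zip-worker.js — decompress a ZIP archive off the main thread.
importScripts('https://unpkg.com/fflate/umd/index.js');

self.onmessage = (event) => {
  // event.data is an ArrayBuffer containing the ZIP archive.
  const files = fflate.unzipSync(new Uint8Array(event.data)); // { name: Uint8Array }
  const buffers = {};
  for (const name in files) buffers[name] = files[name].buffer;
  // Transfer the decompressed buffers back without copying them.
  self.postMessage(buffers, Object.values(buffers));
};

// main.js
const worker = new Worker('zip-worker.js');
worker.onmessage = (event) => {
  // event.data maps file names to ArrayBuffers; hand these to your loaders.
  console.log('unzipped:', Object.keys(event.data));
};
fetch('scene.zip')
  .then((res) => res.arrayBuffer())
  .then((buf) => worker.postMessage(buf, [buf]));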

Finally, you may want to consider the new Basis texture compression library, which will significantly decrease the amount of time your application freezes during texture GPU upload.


Wow, thank you. I will look into Draco then, and I look forward to any new developments you guys make. Right now the scenes I am loading are not very large, but I am modelling this world after OpenSim and Second Life, where a region full of stuff can be up to a GB of data. So when the scenes start getting that large, I really need to think through loading them in stages: the base scene loads first so you can walk around, then details are overlaid on top without everything freezing up. Though I think pressuring people to make better design choices will help too. :slight_smile:

Hi @Mugen87,

Thanks for your suggestion. I am new to the glTF loader and am also trying to reduce the size of my model using gltf-pipeline.
My original file is 43.4 MB, but when I do:
gltf-pipeline -i MyModel.glb -o modelDraco.gltf -d
the new file comes out at 51.3 MB.

The other option is:
gltf-pipeline -i MyModel.glb -o model.gltf
But the new file is 57.9 MB in size.

Am I doing anything wrong? :smiley:

Thanks,

You'll want to keep it as a binary .glb file. Writing a .gltf without separate buffer files embeds the binary data as base64, which makes it larger.

gltf-pipeline -i MyModel.glb -o modelDraco.glb -d
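If you prefer doing this from a Node script instead of the CLI, gltf-pipeline also exposes this as an API. A sketch following its README (compressionLevel ranges from 0 to 10):

// Node equivalent of the CLI command above.
const gltfPipeline = require('gltf-pipeline');
const fsExtra = require('fs-extra');

const glb = fsExtra.readFileSync('MyModel.glb');
const options = { dracoOptions: { compressionLevel: 7 } };

gltfPipeline.processGlb(glb, options).then((results) => {
  fsExtra.writeFileSync('modelDraco.glb', results.glb);
});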

Wow, great!
Thanks @looeee
Now I got 38.4 MB in size.
But from 43.4MB down to 38.4MB, is it possible to reduce more? :smiley:

Use:

gltf-pipeline -i MyModel.glb -o modelDraco.glb -d --draco.quantizePositionBits 11 --draco.quantizeNormalBits 8 --draco.quantizeTexcoordBits 10 --draco.quantizeColorBits 8 --draco.quantizeGenericBits 8

Those are the new default quantization settings from Draco.
If you still don't get favorable compression, try loading your model into Blender and exporting it to a .glb file with compression enabled, using Blender's default compression settings.

Thanks for the Blender tip! My Draco compression using gltf-pipeline resulted in clearly visible losses. Blender's default compression settings worked well, and I got a 16% compression.