Survey: Which is your primary 3D format for three.js apps


#1

I’ve wanted to test the forum’s poll feature for quite a while, and I think this topic is a good opportunity :grin:. You can make exactly one choice. The poll will automatically end in one week (on June 11). Thanks to everyone who participates!

  • OBJ
  • Collada
  • FBX
  • glTF
  • JSON (three.js)
  • something else

0 voters


#2

#3

After the poll, can we discuss these file formats further?
I’d like to hear about the pros and cons of each of them.

Also, has there been a mindset shift in the community (it’s 2018, not 2000) toward favouring one file format over another for particular use cases?


#4

If I didn’t need full flexibility, I would certainly use glTF, but no format that isn’t binary. With a custom format I always know what’s going on and can store various kinds of data that might not even be for the model directly (like a spatial index).

At least with the veteran formats like FBX and OBJ I experienced a lot of issues, heavily depending on the modelling software, especially with FBX.


#5

Well, glTF comes in binary and non-binary flavours. It’s a bit unfortunate that both tend to be called glTF; however, the binary format is actually .glb. The specs are here.

I think that the number of people using non-binary glTF is going to be pretty low, so whenever you hear someone say glTF you should assume that they mean the binary .glb version.


#6

Just to clarify, the buffers are always represented as binary data, no matter whether you use .gltf or .glb. Storing the buffer data inside the .gltf file itself as base64-encoded data is also possible.


#7

New file format in the wild: USDZ, related to augmented reality, unrelated to three.js


#8

I think nothing can beat the native JSON format due to how easy it is to create, maintain and manipulate the file.
Yes, the file size can increase a bit, but gzip compression is always recommended anyway.

The biggest benefit is the material assignment. I never need to worry about material hacking within the app itself to make it look the way I want. The modelers always create the models as close as possible to how they should look, with whatever tools they need. But when it comes to replicating the same material effect with files other than JSON, you always end up tweaking the material properties within your application.

It is always great to have tools like the three.js editor. So, wherever a 3D asset is created, I recommend using the editor for material and texture setup. There’s a lot to be improved in the editor, but hey, you can always help the community by creating your own features and releasing them as a PR, or keep them to yourself, as per your requirements.

Open for feedback on my point of view.


#9

Related:


#10

I rather meant any format that isn’t binary; JSON or XML formats especially just cause performance hiccups, since they are huge and require parsing. I mostly load models on demand at runtime; it’s a different case if everything is loaded at start, though I think it’s still unnecessary, also for file size.

The custom one I use is rather general-purpose, since it’s just a JSON header with binary attachments, and it manages the byte padding for direct use. A lot of binary formats are dedicated to one specific model format, describing the structure in binary as well. That’s what often causes incompatibilities, and basically no (binary) format works entirely with my modelling tools.

Looking into Collada, it’s rather crazy how many symbols you have to use and references you have to look up just in order to reconstruct the model, not to mention the amount of memory allocated for this process. I think some loaders are better suited for converting files rather than for use in production.


#11

@Fyrestar Just out of curiosity: glTF offers an extensions mechanism in order to extend the core format with user-specific features. There is, for example, MSFT_lod, which allows the specification of Levels of Detail (LOD) for a glTF asset. Have you considered using this mechanism for your own enhancements (you mentioned you are using spatial indices)? Or do you think glTF extensions are too limited for your use cases? Any opinions?


#12

I’ve only used JSON because it really helped me understand where my mistakes were made, since you could look “under the hood” at the raw data. It’s the only format I’ve used, and once gzipped, the data always ended up not being very large for transfer. However, I’m excited to try glTF on my next project, and see what performance benefits I’ll get from using the new format!


#13

I find JSON to be a really good format because it’s just so prevalent on the web. It gzips pretty well.

Oddly, I sometimes got smaller JSON meshes than binary files when compressed.

When I load stuff for my custom subdivision, I need ngons, so I convert OBJ to JSON.


#14

It’s nice that it provides this feature, since it’s important, but I generally prefer to be fully flexible and have control over it. Another general advantage is a basic first layer of security that deters most asset theft. With every common format, now including glTF, you only have to open the console and download the file.

With a custom format, someone has to put much more effort into making use of the assets and turning them back into a common format.


#16

As far as reducing the final file size goes, as well as finding a pipeline that suits my needs, the SEA3D format has given me improvements over the other formats I tried as a replacement for the JSON files coming out of the Blender exporter.

I use Maya as my main software, so I just have to export a Collada file and open it in the SEA3D editor, where I set up all my bone animations, textures and animation sequences.

Have a look at this GitHub thread and the official SEA3D website if you are interested:



#17