Loading a big .obj file (about 300 MB) crashes the browser and takes a long time to load

I want to load an .obj file that is about 300 MB. When I load it, the browser crashes; sometimes it takes even longer to load and then dies with a "memory core dumped" error…

Is there another way that I'm missing to load a big object file and its material file?

Thank you in advance… :slight_smile:

You should try to use a binary format. Using OBJ (text) means 300 MB of text has to be parsed by the importer, which allocates even more memory during the process.

Using a binary format will massively reduce the size, and loading will be almost instant (if the binary format is an internal one like glTF).
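
For the conversion itself, the obj2gltf npm tool is one option, something like `npx obj2gltf -i model.obj -o model.gltf` (flag names from memory of its README, so check `obj2gltf --help` for the exact options).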

2 Likes

Okay, I'll try glTF and let you know if the problem is not solved. :slight_smile:

You may also want to try simplifying it losslessly with https://github.com/jonnenauha/obj-simplify or lossily with Blender's Decimate modifier. 300 MB is not a small file no matter how you load it. :slight_smile:
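
For reference, obj-simplify is a command-line tool; if I remember its README correctly, usage is along the lines of `obj-simplify -in model.obj`, and it writes a `*.simplified.obj` next to the input. Double-check the repo for the exact flags.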

2 Likes

Hello guys, I have tried @Fyrestar's advice and converted it to glTF format. The object was reduced from 300 MB to 180 MB, but that is still too much data for web browsers to load…

And @donmccurdy, obj-simplify works fine with small objects, but when I tried to simplify the 300 MB object it took 4-5 hours and only reduced the size to 198 MB…

So I am still searching for another, more feasible way to do this… :confounded:

What does it actually contain? It kind of sounds like it is too much to display on average devices anyway.

You need to provide a little more information: is it a big scene? In that case you could, for example, split the file spatially and load the chunks on demand (see the sketch below).
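
To illustrate the idea (the chunk file layout and naming here are hypothetical, not an actual pipeline), on-demand loading could look roughly like this:

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Hypothetical layout: the scene was pre-split into a grid of files
// named chunk_<x>_<z>.glb, each covering a CHUNK_SIZE x CHUNK_SIZE area.
const CHUNK_SIZE = 100;
const loader = new GLTFLoader();
const loaded = new Set();

function loadChunksAround(camera, scene) {
  const cx = Math.floor(camera.position.x / CHUNK_SIZE);
  const cz = Math.floor(camera.position.z / CHUNK_SIZE);
  // Load the 3x3 block of chunks around the camera, each only once.
  for (let x = cx - 1; x <= cx + 1; x++) {
    for (let z = cz - 1; z <= cz + 1; z++) {
      const key = `chunk_${x}_${z}`;
      if (loaded.has(key)) continue;
      loaded.add(key);
      loader.load(`${key}.glb`, (gltf) => scene.add(gltf.scene));
    }
  }
}
```

Call it from the render loop (or whenever the camera moves) and distant parts of the scene never get downloaded at all.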

If you could provide the file, I could do a test. I'm using a custom binary format which uses bare, ready-to-use buffers and supports compression. I don't know how glTF is assembled, but cutting a 300 MB text file by less than half sounds very unrealistic.

1 Like

Hey, I finally came up with a new trick that saved my day :slight_smile:

Original object file size: 300 MB

Process:

(1) Run obj-simplify, as advised by @donmccurdy. It trims some memory and removes unnecessary stuff.

After this first step I have a 198 MB object file.

(2) Compress it to gzip format, which brings it down to 50-70 MB :sunglasses:

(3) Decompress the gzip file after XHR has downloaded it (note: we use pako.js for the decompression; see the sketch after this list).

(4) Load the decompressed data, which is already sitting in a variable, into three.js using OBJLoader.parse(…).

(5) Boom, object loaded successfully :slight_smile:
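
A minimal sketch of steps (3) and (4); the file name is just an example, and the `{ to: 'string' }` option is what tells pako to return text:

```js
import { OBJLoader } from 'three/examples/jsm/loaders/OBJLoader.js';
import pako from 'pako';

// Step (3): download the gzipped OBJ as raw bytes.
const xhr = new XMLHttpRequest();
xhr.open('GET', 'model.simplified.obj.gz');
xhr.responseType = 'arraybuffer';
xhr.onload = () => {
  // Decompress in memory; pako gives back the original OBJ text.
  const objText = pako.inflate(new Uint8Array(xhr.response), { to: 'string' });

  // Step (4): parse the text directly instead of loading from a URL.
  const object = new OBJLoader().parse(objText);
  scene.add(object); // assumes an existing THREE.Scene called `scene`
};
xhr.send();
```

Worth noting: if the server sends the file with a `Content-Encoding: gzip` header, the browser decompresses it transparently and pako is not needed at all; the manual route above helps when you don't control the server.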

1 Like

I also tried glTF, but it only reduced the size from 300 MB to 170-190 MB, and loading that into three.js does not seem feasible for web browsers.

Are you doing any profiling to see how much RAM is used?
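
For a quick rough check in Chrome (`performance.memory` is a non-standard, Chrome-only API), you can paste this into the console:

```js
// Non-standard Chrome API; other browsers need the DevTools Memory tab instead.
if (performance.memory) {
  const mb = (n) => (n / 1024 / 1024).toFixed(1) + ' MB';
  console.log('used:', mb(performance.memory.usedJSHeapSize),
              'limit:', mb(performance.memory.jsHeapSizeLimit));
}
```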

No, I am not doing any profiling yet.

I found that the amount of RAM used by such big models can exceed the maximum that the browser's JavaScript engine is able to handle.

1 Like

There is no limit in what it can handle; more likely there is a per-process limit defined by the browser, which also depends on the machine. I hit almost 2 GB in a test without any crash.

2 Likes

Maybe you can cache the model in IndexedDB.
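
A minimal sketch of that idea (the database and store names are made up, and error handling is trimmed down):

```js
// Open (or create) a small database with one object store for model data.
function openModelDB() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('model-cache', 1);
    req.onupgradeneeded = () => req.result.createObjectStore('models');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Fetch once over the network, then serve from IndexedDB afterwards.
async function getCachedModel(url) {
  const db = await openModelDB();
  const cached = await new Promise((resolve) => {
    const req = db.transaction('models').objectStore('models').get(url);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => resolve(undefined);
  });
  if (cached) return cached;

  const buffer = await (await fetch(url)).arrayBuffer();
  db.transaction('models', 'readwrite').objectStore('models').put(buffer, url);
  return buffer;
}
```

This only skips the network transfer; the parsing cost on every page load stays the same.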

I am really shocked right now: my 600 MB file was reduced to 6 MB with Draco compression (no parameters, just the default compression)… WOW! That's incredible.
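
For anyone else trying this: the compression step can be done with e.g. `npx gltf-pipeline -i model.gltf -o model.glb -d` (the `-d` flag enables Draco, if I remember gltf-pipeline's options correctly). Loading the result in three.js then needs a DRACOLoader wired into the GLTFLoader, roughly:

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

const dracoLoader = new DRACOLoader();
// Path to the Draco decoder files (copy them from three.js' examples/jsm/libs/draco/).
dracoLoader.setDecoderPath('/draco/');

const loader = new GLTFLoader();
loader.setDRACOLoader(dracoLoader);
loader.load('model.glb', (gltf) => scene.add(gltf.scene)); // assumes a `scene`
```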

1 Like