Persistent Asset Storage - How to save models & textures to the user's machine 🖥

How To Store Reusable Assets On the User's Local Machine


Live example: here
Example code: here
Sample changes code: here (see below for explanation, browser support, and usage - mind, there’s a bit of reading.)

What? Why?

While creating a preloader for WhyDungeons, I wanted to mimic what Jagex did with RuneScape 2 - storing the most valuable assets locally for faster load times and smaller bandwidth usage. Opening the game for the first time took significantly longer than any subsequent run (mind, it was 2004 - the internet wasn't powerful.)

Based on Can I use GLTFLoader to load a file from the local file system of the browser, Off-line first, Is there anyway I can cache models in the user’s browser cache? and a few other SO threads - nobody seems to have given a definitive example of how to store and load assets from the local hard drive.

So after going through all the answers and MDN docs, here’s a complete solution for safely storing images / models / audio / textures locally with browser JS (not tested on Node / workers.)

Limits & Compatibility

(see sources at the very bottom, if you care about credibility :’) )

General Support table: here

How much can I store?

“The maximum browser storage space is dynamic — it is based on your hard drive size. The global limit is calculated as 50% of free disk space. In Firefox, an internal browser tool called the Quota Manager keeps track of how much disk space each origin is using up, and deletes data if necessary.” ²

Mind that the 50% (up to 80% for Chrome³) drive limit is for the entire browser. If your data gets stale, the user stops visiting, etc. - your data will eventually be evicted from best-effort storage and will need to be re-downloaded.

Assume you can safely store up to 500MB-1GB of data locally, unless you target IE users (during testing I memory-leaked ~5GB and Chrome didn’t even ask for permission. :man_shrugging:)
You can check the exact numbers using:

await navigator.storage.estimate();
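A minimal sketch of acting on that estimate before writing to storage (the helper names and the 10% safety margin are my own, not from any spec):

```javascript
// Decide whether `bytes` more data fits within the origin's quota, keeping
// a safety margin so the allowance is never filled completely.
function canStore(estimate, bytes, margin = 0.1) {
  const free = estimate.quota - estimate.usage;
  return bytes <= free * (1 - margin);
}

// Browser-only usage (StorageManager API; unavailable in Node and some workers):
async function hasRoomFor(bytes) {
  if (typeof navigator === 'undefined' || !navigator.storage?.estimate) {
    return true; // no estimate available - try optimistically, catch write errors
  }
  return canStore(await navigator.storage.estimate(), bytes);
}
```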

What happens if user does a hard-reload and clears cache?

Nothing - IndexedDB is independent and will not be cleared by a hard-reload.
Saved assets can still be cleared via devtools, quota eviction, or expiration.
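If you want to lower the odds of eviction further, you can ask for persistence explicitly. A sketch using the real `navigator.storage.persist()` / `persisted()` API - browsers may grant it silently, show a prompt, or refuse:

```javascript
// Ask the browser to mark this origin's storage as persistent, exempting
// IndexedDB data from automatic eviction under storage pressure.
async function requestPersistence() {
  if (typeof navigator === 'undefined' || !navigator.storage?.persist) {
    return false; // API unavailable (Node, old browsers, some workers)
  }
  if (await navigator.storage.persisted()) return true; // already granted
  return navigator.storage.persist(); // resolves true if the browser agrees
}
```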


What happens if my app exceeds local storage limits?

Nothing - the write will simply fail; you proceed as usual, just without caching.
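In practice that means wrapping writes in a try/catch. A sketch using localForage's promise-based `setItem` (the fallback flow is my assumption; `store` is localForage or anything compatible):

```javascript
// Try to cache an asset; on failure (e.g. QuotaExceededError) just skip
// caching - the app proceeds as usual and re-fetches the asset next time.
async function tryCache(store, key, value) {
  try {
    await store.setItem(key, value);
    return true;
  } catch (err) {
    console.warn(`Could not cache "${key}" (${err.name}) - continuing uncached`);
    return false;
  }
}
```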


Can I store only JSON / text data?

No⁴. IndexedDB does not care about the data format - see all supported types here. (LocalForage uses WebSQL and localStorage as fallbacks - assume IndexedDB is the current standard; WebSQL is no longer supported³.)
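For binary assets, that means you can persist the raw bytes directly. A sketch (the `fetchFn` parameter is mine, so the helper isn't tied to the global `fetch`):

```javascript
// Fetch a same-origin asset and persist its raw bytes. ArrayBuffer is
// structured-cloneable, so no base64 round-trip is needed for IndexedDB.
async function cacheBinary(store, url, fetchFn = fetch) {
  const response = await fetchFn(url);
  const buffer = await response.arrayBuffer(); // raw model/texture/audio bytes
  await store.setItem(url, buffer);            // store: localforage or compatible
  return buffer;
}
```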


Can I store cross-origin data?

No⁵. Saved data must be same-origin and not “tainted” (i.e. it must be loaded via XHR / fetch, not via an <img> tag.)


Code Explanation (& How To Implement Yourself)

(Quick note - the code above is not yet 100% compatible with Three.js, otherwise I’d already be fighting with @mrdoob about merging the revamped Cache and FileLoader into the main repo :’) IndexedDB is asynchronous, while ImageLoader.load requires a synchronously returned value. Also, I didn’t test it enough to promise it will always work. :slight_smile: )

To enable local cache, add the following changes:

1. Allow overriding the default Cache behaviour

You can either inject LocalForage directly into the Cache object (Cache is shared globally, so it’s enough to add it once), or do as is done here and add a method that allows overriding the default Cache behaviour with LocalForage support (adjust the other Cache methods to work with the override too - complete diff.)
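A dependency-free sketch of that override (the factory name is mine; `backend` stands for localForage or anything with promise-based `getItem` / `setItem`). The key change is that `get` takes callbacks, since an IndexedDB read cannot return synchronously:

```javascript
// A Cache-like object backed by async storage. It mirrors the shape of
// three's Cache (add/get/remove), but get() is callback-based.
function createAsyncCache(backend) {
  return {
    enabled: true,
    add(key, file) {
      backend.setItem(key, file).catch(() => {}); // ignore quota failures
    },
    get(key, onHit, onMiss) {
      backend.getItem(key).then(
        (file) => (file != null ? onHit(file) : onMiss()),
        () => onMiss() // treat read errors as cache misses
      );
    },
    remove: (key) => backend.removeItem(key),
  };
}

// In the browser, roughly: THREE.Cache = createAsyncCache(localforage);
// (plus the loader changes from step 2)
```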

2. Add asynchronous Cache to core loaders

Three core loaders can take advantage of caching - FileLoader, ImageLoader, and ImageBitmapLoader. To let them use the async Cache, take the entire body of each load method and place it in the Cache.get callback (see here for an example.)

Change all 3 loaders.
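The resulting control flow, as a sketch (the callback-based `cache.get` is assumed from step 1; `fetchAsset` stands in for the loader's original network code):

```javascript
// Cache-first load: consult the async cache, hit the network only on a
// miss, then populate the cache so the next visit skips the request.
function cachedLoad(cache, fetchAsset, url, onLoad, onError) {
  cache.get(
    url,
    (cached) => onLoad(cached), // hit: no request leaves the browser
    () => fetchAsset(url).then((data) => {
      cache.add(url, data);     // miss: store for next time
      onLoad(data);
    }, onError)
  );
}
```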

3. Enable Cache

Remember to enable Cache. You can now enjoy your locally stored assets :tada:


Speed & Performance Comparison

First, if loading speed is all you care about and you don’t serve hundreds of assets - it’s best not to bother with caching. The internet is fast these days, and load times will not change much (~1s difference with assets stored locally.)

If you (1) serve a lot of models and textures or (2) want to offload your server / CDN - do bother with caching. Once local storage for assets is enabled, users simply stop sending requests for them:


(Out of ~40MB of assets, only 1.5MB is fetched from the server - only scripts and fonts.)

When To Avoid

If your assets change often (and are not versioned nicely), then once a user downloads a model they will stop receiving any updates to it.
Stale data is a common problem with caching, so be sure to either invalidate old assets when necessary or use versioned URLs - otherwise, what your users see may drift significantly over time from what you see.
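One simple versioning scheme, sketched (the `?v=` convention and the names are mine, not from any standard):

```javascript
// Bump ASSET_VERSION whenever models/textures change: versioned URLs make
// updated assets a guaranteed cache miss, and stale entries easy to spot.
const ASSET_VERSION = 3;

function versionedUrl(url) {
  return `${url}?v=${ASSET_VERSION}`;
}

function isStale(cacheKey) {
  return !cacheKey.endsWith(`?v=${ASSET_VERSION}`); // candidate for removal
}
```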

Sources


Nice post.
I’m using localForage to store multiple saved-state info of user progress. One thing I remember from my research is that Safari now purges local storage after a week.
…something to keep in mind.


@mjurczyk

Awesome work! Looks like some good improvements – this is interesting to me as I’m dealing with a similar issue and I have a few thoughts / questions if you don’t mind:

  • Assuming this is caching the response data from the request (which is what I believe is stored in THREE.Cache) I partly figured that browsers did this automatically as long as you set the Cache-Control header correctly. Have you tried that? Can you talk about how it compares?

  • Have you considered something like a service worker and the cache api? (I’m not extremely familiar with it myself)

  • Considering you’re using IndexedDB, which supports ArrayBuffer storage, have you considered storing the pre-parsed geometry attribute array buffers rather than the original request contents? You’d have to do some kind of light serialization of the rest of the object / materials as well, but then you could also bypass the overhead of parsing the full request. Granted, if you’re using glTF, those benefits may be less significant, if any.


I experimented with this myself. I figured it’s not worth writing a custom solution for caching static assets - in my experience, the browser does a really good job on its own. Overriding that and using up IndexedDB quota was something I decided against.

I want to make sure there’s enough quota for things that actually matter to gameplay, such as game save data and storing user settings.

I have thought about storing generated data, things like pre-processed textures and geometry that would take a long time to generate otherwise. So far, with careful choice of data structures and algorithms - I didn’t run into the need to do such a thing. I have come close a few times, so I see value in that.

My biggest issue is the lack of a clear specification that would allow you to rely on a large quota of local IndexedDB being available. You’re only guaranteed a few MB by the spec, if I’m not mistaken.


I kinda hoped to get as many suspicious looks for it as possible. Thanks! :heart: :slight_smile: It’s absolutely not battle-tested yet - just everything I found, combined - so don’t put it in any production app yet. The more doubts, experiences, and known limitations we can collect, imo the better.

(Just a quick disclaimer - this entire research happened to limit the need to connect to the server in any way. WhyDungeons uses AWS for asset hosting (there are at least ~100-200 tiny assets streamed via a mix of EC2 & S3), and any asset that can be saved on the user’s machine is an asset I don’t have to pay the transfer fee for.)

  1. First, I realised my biggest mistake: I shouldn’t have called it a “cache”. It’s persistent “storage” - a cache can be cleaned up whenever and however the browser chooses, while IndexedDB seems to be a bit more predictable and persistent. Even with the cache enabled, the browser still sometimes sent requests to S3 for the same assets - with IndexedDB, it never did.
  2. @GlifTek - yeah, I read about Safari, but do you remember if it was after just a week, or a week of inactivity? The other option sounds reasonable, and as long as the user uses our app - they would keep assets stored locally. :thinking:
  3. @gkjohnson - yep, Cache-Control works very well with a well-configured CDN / CloudFront (I can’t configure either very well, tho :’) ) - requests never touch S3 / the actual server then. I am torn between using headers and IndexedDB - while the former is more mature and standardised, the latter makes it a bit easier to pick which assets should be stored (ex. save models & textures, but let all the app scripts be re-fetched every time the user visits.)
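For reference, the usual header recipe for the Cache-Control approach with versioned asset URLs (a common convention, not something specific to this thread) - a long-lived, immutable policy so the browser’s own HTTP cache never revalidates the asset:

```http
Cache-Control: public, max-age=31536000, immutable
```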

Have you considered something like a service worker and the cache api? (I’m not extremely familiar with it myself)

  4. Curious what you mean, if you elaborate / share some links, I may just consider researching it for us to compare. :thinking:

Considering you’re using IndexedDB which supports ArrayBuffer storage have you considered storing the pre-parsed geometry attribute array buffers rather than the original request contents?

  5. Hm, same as 4 - is there maybe a part of the three source that does this geometry parsing you could link to? Right now it, indeed, stores just the responses from FileLoader / ImageLoader.

I want to make sure there’s enough quota for things that actually matter to gameplay, such as game save data and storing user settings.

  6. Considering the point @GlifTek mentioned, that’s the part I’d actually save on the server, if I understand correctly. :thinking: Losing locally stored models results in slightly longer loading; losing locally stored save data is a gamer’s personal tragedy.

My biggest issue is lack of clear specification that would allow you to rely of a large quota of local IndexedDB being available. You’re only guaranteed a few Mb by the spec, if I’m not mistaken.

  7. First, yes, I’d totally consider it as a fallback that allows faster loading / fewer requests, if the user’s browser allows it. As for the guaranteed quota, according to Google dev docs it’s up to 50% of the disk space. And as my memory-leak test showed, leaking 5-10GB of data didn’t even trigger a warning to the user that my app had decided to take over their SSD :’) (Firefox at least shows a warning after a few MB.)
Curious what you mean, if you elaborate / share some links, I may just consider researching it for us to compare. :thinking:

As far as I’m aware, service workers are designed to enable offline use of a web application, or at least to minimise content downloads on subsequent visits. Here are some links to the APIs I’m referring to, though keep in mind I haven’t done anything with them myself:
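For comparison with the IndexedDB approach, a cache-first sketch of that idea (the `caches.open` / `match` / `put` calls are the real Cache API; the bucket name and helper function are hypothetical):

```javascript
// Cache-first strategy, written as a plain function so it isn't tied to
// the worker global scope. `cachesApi` is the CacheStorage global `caches`.
async function cacheFirst(cachesApi, request, fetchFn) {
  const cache = await cachesApi.open('assets-v1'); // bucket name is arbitrary
  const hit = await cache.match(request);
  if (hit) return hit;                             // serve the stored response
  const response = await fetchFn(request);
  cache.put(request, response.clone());            // keep a copy for next time
  return response;
}

// Inside sw.js, roughly:
// self.addEventListener('fetch', (e) =>
//   e.respondWith(cacheFirst(caches, e.request, fetch)));
```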

Hm, same as 4, is there maybe a part of three source that does this geometry parsing you could link to? Right now it, indeed, stores just the responses from FileLoader / ImageLoader.

Consider something like an OBJ file, which takes a long time to parse - it gets converted into a three.js buffer geometry with attributes that are backed by typed arrays and uploaded to the GPU. Rather than storing the whole ASCII blob of OBJ data in IndexedDB, you could instead store those array buffers, which would likely be substantially smaller than the OBJ contents anyway. Then, when loading the geometry from the IndexedDB cache again, you can just stuff those array buffers into buffer attributes on a buffer geometry without having to go through the slow OBJ parsing step.
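A rough sketch of that serialization (the helper name is mine; `geometry` is assumed to be a parsed `THREE.BufferGeometry`-like object):

```javascript
// Flatten a geometry's attributes into plain structured-cloneable data.
// Typed arrays go into IndexedDB as-is - no OBJ text, no re-parsing.
function serializeGeometry(geometry) {
  const attributes = {};
  for (const [name, attr] of Object.entries(geometry.attributes)) {
    attributes[name] = { array: attr.array, itemSize: attr.itemSize };
  }
  return { attributes, index: geometry.index ? geometry.index.array : null };
}

// Restoring is the reverse: new THREE.BufferGeometry(), then per entry
//   geometry.setAttribute(name, new THREE.BufferAttribute(array, itemSize));
```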


Imo that’s just a statement without a source or meaningful input.

Yes, storage does have limits, but so do hard drives and caches - that shouldn’t stop us from using either :man_shrugging: (especially as just a backup, when available - and as seen above, these limits leave plenty of space to use.)

I’m diving into the PWA and ServiceWorker limitations @gkjohnson linked, and at some point MDN also suggests using IndexedDB as a fallback for a manual local Cache:
