Is the usage of Content Delivery Networks (CDN) for deployment recommended/discouraged?

The usage of importmaps like the following

<script type="importmap">
  {
    "imports": {
      "three": "https://unpkg.com/three@0.172.0/build/three.module.js",
      "three/addons/": "https://unpkg.com/three@0.172.0/examples/jsm/"
    }
  }
</script>

has developed into a handy habit for me during development, and lately also for deploying finished projects to my private, nonprofit blog. It allows for easy switching between releases, and it works well - until it doesn’t!

I’m currently running into availability issues: unpkg.com is not responding within Safari’s two-minute timeout. Which inherently causes all my Three.js-related projects on my website to break down simultaneously :sob:

So I’m wondering whether the usage of CDNs for deployment is such a good idea after all? :thinking:

Any thoughts/advice on this?

2 Likes

If I find myself needing to load something from one of those CDNs, I just download it the first time, then use “Save as” in the Network tab of Chrome to copy the library right into my app.

The CDN promise of distributed loading and parallelism is a lie. They always load slowly and have longer startup times. If it’s local, all your file I/O is going to perform consistently.
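
For reference, switching the importmap from the first post to such locally saved files only means swapping the CDN URLs for local paths. A sketch - the vendor/ directory name is just an example, not anything standard:

```html
<script type="importmap">
  {
    "imports": {
      "three": "./vendor/three-0.172.0/build/three.module.js",
      "three/addons/": "./vendor/three-0.172.0/examples/jsm/"
    }
  }
</script>
```

The rest of the application code can then keep using the same bare specifiers (`three`, `three/addons/`) unchanged.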

6 Likes

For quite some time, unpkg.com has been degrading in terms of response time and availability. I used it in the past, but now I use jsDelivr. No issues so far.

<script type="importmap">
  {
    "imports": {
      "three": "https://cdn.jsdelivr.net/npm/three@0.174.0/build/three.module.js",
      "three/addons/": "https://cdn.jsdelivr.net/npm/three@0.174.0/examples/jsm/"
    }
  }
</script>

In some cases, however, I prefer to put the files locally, but this happens quite rarely nowadays. As for bundling - I dislike it on some irrational level. I understand its benefits, but I still cannot bring myself to like it.

6 Likes

Thanks for recommending this alternative :+1: which worked for me instantly.

It seems that, basically, when using a CDN you’re trading convenience for an additional chance of failure.
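
One middle ground, if you still want the CDN’s convenience, is a dynamic-import fallback that tries a local copy when the CDN fails. A sketch only - `importWithFallback` and the paths below are made-up names, and note that static importmap entries cannot express fallbacks, hence the use of dynamic `import()`:

```javascript
// Try each module URL in order; on failure (network error, 404, timeout),
// move on to the next candidate. Returns the first module that loads.
async function importWithFallback(urls) {
  let lastError;
  for (const url of urls) {
    try {
      return await import(url);
    } catch (err) {
      lastError = err; // remember the error and try the next source
    }
  }
  throw lastError;
}

// Usage sketch (CDN first, local backup second; paths illustrative):
// const THREE = await importWithFallback([
//   "https://cdn.jsdelivr.net/npm/three@0.174.0/build/three.module.js",
//   "./vendor/three.module.js",
// ]);
```

The price is that your entry point has to use dynamic imports instead of plain `import` statements, so this suits small pages better than large codebases.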

1 Like

Thanks also for your account. I’ll probably revert to deploying the needed libraries locally on my own webspace.

1 Like

About as good an idea as gambling is :see_no_evil_monkey: Since you’re already serving your project’s JS anyway, there’s basically no reason to rely on third-party CDNs instead of just using Vite and bundling the dependencies - you’re accepting unnecessary risk at little-to-no gain.

5 Likes

I was suspicious of CDNs right from the start. If they stop working, all examples are no longer immediately executable.

I therefore decided back in 2017 to save the required three.js sources, in the respective revision, locally for my collection - likewise for other resources. Of course, this takes some work.

If you download the zip file of one year’s edition of the collection, you can be sure that all the basic examples will work permanently (the extended examples are only links, mostly using CDNs).
Unless, that is, the browsers introduce new requirements again, as was the case with older video-texture examples: they no longer work because user interaction is now required.

2 Likes

Likewise here for a long time; then npm started to make sense: npm i three@re.vis.ion saves opening a browser, finding a repo and downloading it - it’s all in the terminal, with 100% local control of the directories.

Got the same issue while updating a GitHub repo. It seems to be resolved for now, but migrating from unpkg to jsDelivr is a possible alternative.

1 Like

And gambling on the persistence of a service that has no obvious business model other than “give away bandwidth for free until…?”