TextureLoader "random" CORS issue (AWS S3, NOT on Azure Blob Storage)

Going slowly nuts so my last resort here.

I have a plain JPG in an S3 bucket with all permissions. I load this remote texture via TextureLoader.

Randomly I receive this error:

Access to image at 'https://xxxxxxxx.s3.ap-southeast-1.amazonaws.com/xxxxxxx/ape.jpg' from origin 'https://aaa.bbb.ccc' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.

When I refresh the page (typically in Chrome with DevTools open), the error disappears. It only occurs when the page is running on a remote server over HTTPS. On localhost (HTTP), with the JPG still in S3, it never occurs…

I played with loader.setCrossOrigin('anonymous'); etc., but it made no difference.
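For reference, the loading boils down to something like this (a simplified sketch; the material is just a placeholder, not our actual setup):

```js
import * as THREE from 'three';

// Simplified sketch of the loading code (placeholder material)
const loader = new THREE.TextureLoader();
loader.setCrossOrigin('anonymous'); // tried with and without; no difference

const material = new THREE.MeshBasicMaterial();

loader.load(
  'https://xxxxxxxx.s3.ap-southeast-1.amazonaws.com/xxxxxxx/ape.jpg',
  (texture) => {
    material.map = texture;
    material.needsUpdate = true;
  },
  undefined,
  (err) => console.error('Texture load failed:', err)
);
```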

I am lost. What could be the reason? FYI: only images have this issue. GLB, MP4 and HDR files all load fine, and they reside in the same S3 bucket with the same permissions.

Having DevTools open usually disables the browser cache.

Can you reproduce the issue when opening the page in incognito mode? Does it also happen with different browsers? Try it with a fresh Firefox installation. Can you also reproduce with Chrome from a different computer?

Thanks for getting back to me!

Chrome and Edge without DevTools open always result in the above CORS error.
With DevTools open it always works!

The exact same behavior occurs in incognito mode.

In Firefox it always works, never CORS.

In case it's useful: these are the headers AWS S3 adds to the image:

For anyone who comes across the same issue: we decided to move to Azure Storage, where the problem does not occur! All we needed to define was some CORS settings in Azure (see the bottom image).

In AWS, I cannot set ExposeHeaders to * like in Azure, and frankly I have no idea what to put there. Besides that, why are GLBs OK while JPGs are not? I can't imagine it's because of this, but I'm showing it anyway.

I welcome anyone who can shine a light on this issue, as in some cases we might be forced to use AWS.

[image: CORS settings screenshot]

Just a stupid question: is the texture being used twice … I mean, shown as a thumbnail in the UI and also used on the model in the canvas?

It is!! Please tell me this is the reason!

Sorry! I don’t know the reason why it happens …

And we ended up using two different textures, one for the thumbnail and one for the model use case … we used the “Spark” library to compress and change the format of our thumbnail when we upload the texture to S3.

Here is another method you can try … I tested it and it seems to work, but I only tested it recently, so I can't say it's a foolproof solution.

You can assign this as the source for the thumbnail:

[image: snippet that appends a unique query string to the thumbnail URL]
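In essence (a rough sketch; the element and variable names are placeholders), make the thumbnail URL unique so it never shares a cache entry with the texture request:

```js
// Placeholder element and URL. The point is only to make the thumbnail URL unique,
// so the <img> request and the TextureLoader request never hit the same cache entry.
const textureUrl = 'https://xxxxxxxx.s3.ap-southeast-1.amazonaws.com/xxxxxxx/ape.jpg';

const thumbnailImg = document.querySelector('#thumbnail');
thumbnailImg.src = `${textureUrl}?v=${Date.now()}`;

// The texture itself keeps loading from the plain URL as before.
```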

Do let us know if it worked for you.

Yes, your trick worked: when the thumbnail URL is made unique in this manner, the texture never reuses the cached image. Thanks so much for your solution.

It means the TextureLoader request was being answered from the browser cache with the response that was cached for the thumbnail <img>. That was not a CORS request, so the cached response carries no Access-Control-Allow-Origin header, and the texture request gets blocked. Having DevTools open disables the cache by default, so that explains that part.
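For anyone landing here later: another option that should avoid the duplicate download (untested on our side) is to make the thumbnail <img> request a CORS request as well, so the cached response already carries the Access-Control-Allow-Origin header and can be reused by the TextureLoader:

```js
// Untested sketch: load the thumbnail with CORS enabled so the cached response
// includes the Access-Control-Allow-Origin header (assuming the bucket sends it).
const thumbnailImg = document.querySelector('#thumbnail'); // placeholder element
thumbnailImg.crossOrigin = 'anonymous'; // must be set before assigning .src
thumbnailImg.src = 'https://xxxxxxxx.s3.ap-southeast-1.amazonaws.com/xxxxxxx/ape.jpg';
```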


Lol, jesus man, I just fixed a similar problem with Azure Storage five minutes ago, but my problem was that I had no “http://” prefix.

This explains it.

I had a similar issue loading from a GCP storage bucket.

I set up CORS for the storage bucket and it fixed it!
This doc reference walked me through it: How do I set up CORS for my Google Cloud Storage Bucket?

I’m sure there would be a similar workflow using AWS.
I hope this helps!
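For AWS, the equivalent is a CORS configuration on the bucket (S3 console → Permissions → Cross-origin resource sharing). Something along these lines should be a reasonable starting point; I haven't verified this exact JSON against the setup above:

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["https://aaa.bbb.ccc"],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
```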