How to use blob images as textures

I want to load multiple images as textures without blocking the main thread. I use a worker in which I load the images as blobs so that I can send them back to the main thread. But how can I use the blob images as textures so that I can pass them to a shader?
To do this, I have to convert the blob images back into images in order to use them as textures. The question is: how do I convert blob images into textures?

// worker code

async function LoadImages(urlList) {

   const blobImages = await Promise.all(
      urlList.map(async (url) => {

         // download each file and expose the blob through an object URL
         const response = await fetch(url);
         const fileBlob = await response.blob();
         const image = URL.createObjectURL(fileBlob);

         return { image };
      })
   );

   return blobImages;
}

self.onmessage = async (msg) => {

   const urls = msg.data.urlList;
   const result = await LoadImages(urls);

   self.postMessage(result);
};
// here I receive the blob images from the worker

let blobImages;

worker.onmessage = (e) => {
   blobImages = e.data;
};

// now I need to convert the blobImages into textures to send them to shader uniforms, but how do I convert them?
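For context, this is roughly how the worker is created and fed from the main thread (the file name and URLs here are just placeholders):

const worker = new Worker('loadImages.js');                  // placeholder file name
worker.postMessage({ urlList: ['img/a.png', 'img/b.png'] }); // placeholder URLs, becomes msg.data.urlList in the worker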

Wouldn’t you just use TextureLoader with the object URL?

TextureLoader does not work in the worker; it probably depends on the DOM. But that’s not my problem. An HTML image can easily be used as a texture with CanvasTexture(). I just don’t know how to get my image back from the blob’s object URL. I find many examples that cover creating the blob and the object URL, but so far nothing that really explains the reverse step back to an image file.

URL.createObjectURL returns a URL string, not the image itself: a URL for accessing/loading the file like an external resource. You can use this URL with TextureLoader, or in the case of an image:

const image = new Image();
image.src = URL.createObjectURL(fileBlob);
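For example, a rough sketch on the main thread, assuming three.js and a ShaderMaterial with a uMap uniform (both placeholder names), and that blobImages[0].image holds the object URL from your worker:

// Option 1: TextureLoader treats the object URL like any other URL
new THREE.TextureLoader().load(blobImages[0].image, (texture) => {
   material.uniforms.uMap.value = texture;
   URL.revokeObjectURL(blobImages[0].image); // release the blob once the texture is loaded
});

// Option 2: build an Image yourself and wrap it in a Texture
const img = new Image();
img.onload = () => {
   const texture = new THREE.Texture(img);
   texture.needsUpdate = true; // triggers the GPU upload on the next render
   material.uniforms.uMap.value = texture;
};
img.src = blobImages[0].image;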

Also, loading resources is always asynchronous, so there isn’t really a point in doing this in a worker unless you’re processing/decoding something. If you experience lags on the main thread, it’s likely for a different reason, such as the upload of large textures, or textures whose sizes are not powers of two causing a resize before upload.


Ah, does that mean that my worker loads the file, and the URL only refers to where the loaded file sits in RAM, so that the main thread can then access the loaded file through that URL?

PS
I do a lot of decoding, that’s why I use a worker.
However, now I just want to load an image file at some point (depending on the case) at runtime and pass it to a shader without disturbing the main thread. If you have an elegant solution, I’m looking forward to an example. This point is still one of the few big ones left on my once long list.
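One possible approach, just as a sketch and not specific to your setup: decode in the worker with createImageBitmap and transfer the resulting ImageBitmap, so both the download and the decode stay off the main thread (material and uMap are placeholder names again):

// worker: fetch and decode off the main thread
self.onmessage = async (msg) => {
   const response = await fetch(msg.data.url);
   const blob = await response.blob();
   const bitmap = await createImageBitmap(blob); // decoding happens here, in the worker
   self.postMessage({ bitmap }, [bitmap]);       // ImageBitmap is transferable, no copy
};

// main thread: wrap the already decoded bitmap in a texture
worker.onmessage = (e) => {
   const texture = new THREE.CanvasTexture(e.data.bitmap);
   material.uniforms.uMap.value = texture;
};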

The decoding thing got me thinking. If I already have the image data in decoded form in an array, there must be a way to transfer it to a shader as a texture. When loading a texture, the texture loader must also read the image data so that the shader can understand it. That makes me think of DataTexture:

const texture = new THREE.DataTexture( data, width, height );

Since I already have the data, width and height, because I decoded them for something else in the worker, I would save myself unnecessary work if I could put them directly into a shader as a texture. That sounds like a very resource-saving way to me at the moment. I have no insight into how much work DataTexture has to do to combine data, width and height into one texture, but I imagine that DataTexture is just a shell that combines these three components. Or could I even pass a data array from a worker directly to the GPU and then point a shader at it? This topic makes me very curious.
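A rough sketch of that idea, assuming the worker already holds decoded RGBA pixels in a Uint8Array (pixels, material and uMap are placeholder names):

// worker: post the decoded pixels; transferring the buffer avoids a copy
// (the worker loses access to the buffer afterwards)
self.postMessage({ data: pixels, width, height }, [pixels.buffer]);

// main thread: wrap the raw pixels in a DataTexture
worker.onmessage = (e) => {
   const { data, width, height } = e.data;
   const texture = new THREE.DataTexture(data, width, height, THREE.RGBAFormat);
   texture.needsUpdate = true; // without this the shader samples nothing
   material.uniforms.uMap.value = texture;
};

As far as I can tell, DataTexture really is just a thin wrapper around the array plus its dimensions and format; the actual GPU upload only happens once the renderer sees needsUpdate.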