Image size only half as large as shown in the image file

I'm seeing a phenomenon that I don't understand. I have an image file of 21600 x 10800 pixels,

but when I read the width and the height of the picture from the canvas I only get 10800 x 5400.

Does anyone understand why I only get half of what the image file says it has?

const image = await new THREE.ImageLoader().loadAsync('./resources/image0.jpg');
const map = utils.GetImageData(image);

GetImageData: function(image) {
   let canvas = new OffscreenCanvas(image.width, image.height);
   let context = canvas.getContext('2d');
   context.drawImage(image, 0, 0);
   const data = context.getImageData(0, 0, image.width, image.height);
   const width = image.width;
   const height = image.height;

   return {data, width, height};
}
Why is width only 10800 and not 21600? Why is height only 5400 and not 10800?

A canvas can't be arbitrarily large. Browsers impose limits on canvas dimensions and total area, and you probably hit them with your image.
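One way to confirm that a silent downscale happened is to compare the dimensions the file claims with what you actually read back. A minimal sketch; `downscaleFactor` is a made-up helper for illustration, not a canvas API:

```javascript
// Hypothetical helper: compares the dimensions the image file claims
// with the dimensions actually read back, and returns the scale
// factor the browser silently applied.
function downscaleFactor(fileWidth, fileHeight, readWidth, readHeight) {
  const fx = fileWidth / readWidth;
  const fy = fileHeight / readHeight;
  // Both axes should shrink by the same factor if the browser downscaled.
  if (Math.abs(fx - fy) > 1e-9) {
    throw new Error('Non-uniform scaling: something else is going on');
  }
  return fx;
}

// With the numbers from the question:
console.log(downscaleFactor(21600, 10800, 10800, 5400)); // → 2
```

A factor of exactly 2 on both axes, as reported above, is consistent with the browser halving the image rather than a decoding error.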

Ah, I wasn't aware of that. Chrome has a limit of 32767 px per dimension, so my image file would be too big. Is there another way to store image pixels in an array without going through a canvas context?
An alternative would be to divide the image into 4 parts. Then I load each of them into an array via an OffscreenCanvas (I don't need a normal canvas) and have the overall image in four arrays. But a single array with all the pixels would be more elegant; it would save me the case distinction when I have to switch between the 4 arrays.
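That case distinction can be confined to one small lookup function, so the rest of the code never has to know there are four arrays. A minimal sketch, assuming a fixed 2 x 2 grid of equally sized tiles (`tileFor` is a made-up name, not part of three.js):

```javascript
// Map a global pixel coordinate in the full 21600 x 10800 image to a
// (tile index, local coordinate) pair in a 2 x 2 grid of 10800 x 5400
// tiles. In the browser, each tile's pixel array would be produced by
//   context.drawImage(image, -tileX * tileW, -tileY * tileH)
// on its own OffscreenCanvas, followed by getImageData.
const tileW = 10800, tileH = 5400, tilesPerRow = 2;

function tileFor(x, y) {
  const tileX = Math.floor(x / tileW);   // 0 or 1
  const tileY = Math.floor(y / tileH);   // 0 or 1
  return {
    tile: tileY * tilesPerRow + tileX,   // index into the four arrays
    localX: x - tileX * tileW,
    localY: y - tileY * tileH,
  };
}

// Example: the pixel just right of the vertical midline
console.log(tileFor(10800, 0)); // → { tile: 1, localX: 0, localY: 0 }
```

Every read then becomes `arrays[t.tile]` indexed with `t.localX` / `t.localY`, so the branching lives in exactly one place.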

I've never worked with such large image data, so I can't recommend anything, sorry.

Thanks anyway. The information that there is a limit saved me a lot of time; now I won't waste time racking my brain about why it doesn't work with 40k-pixel images.