KTX2 compression artifacts

I’m trying to build a floorplan viewer where the floors and their layers are represented by large images. I receive these images as PNGs, and the largest are close to 30k pixels in resolution.

You can then draw and place 3D objects on top of these images, which is why I use three.js.

The best solution I’ve come up with so far is to chunk these large images into smaller pieces, convert them to KTX2, and tile them. Performance is great: both memory usage and load times are very low.
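For reference, a minimal sketch of the tiling math I mean (the 1024px tile size is an illustrative assumption, not a recommendation):

```javascript
// Compute the tile grid for a large source image.
// tileSize = 1024 is an assumption; any power-of-two size works similarly.
function tileGrid(width, height, tileSize = 1024) {
  const cols = Math.ceil(width / tileSize);
  const rows = Math.ceil(height / tileSize);
  const tiles = [];
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      tiles.push({
        x: col * tileSize,
        y: row * tileSize,
        // Edge tiles may be smaller than tileSize.
        w: Math.min(tileSize, width - col * tileSize),
        h: Math.min(tileSize, height - row * tileSize),
      });
    }
  }
  return tiles;
}

// A ~30k image splits into a 30x30 grid of 1024px tiles.
const tiles = tileGrid(30000, 30000);
console.log(tiles.length); // 900
```

Each tile is then cropped out of the PNG, converted to KTX2 on its own, and positioned in the scene from its `x`/`y` offset.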

Unfortunately, when I convert them from PNG → KTX2 I get some artifacts that I don’t think are acceptable for my use-case. I’m not sure whether this is something I can tune in three.js via texture or renderer settings, or whether I can improve the toktx parameters.

These are the parameters I’ve had most success with so far:

toktx --2d --genmipmap --target_type RGBA --t2 --encode etc1s --clevel 5 --qlevel 255 --assign_oetf srgb
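In case it helps, this is roughly how I run it over a directory of tiles (the paths and glob are illustrative; the toktx flags are the ones above). This is a CLI sketch rather than something runnable here:

```shell
# Convert every PNG tile in ./tiles to KTX2 with the ETC1S settings above.
# toktx takes the output file first, then the input:
# tile_0_0.png -> tile_0_0.ktx2
for f in ./tiles/*.png; do
  toktx --2d --genmipmap --target_type RGBA --t2 \
        --encode etc1s --clevel 5 --qlevel 255 \
        --assign_oetf srgb \
        "${f%.png}.ktx2" "$f"
done
```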

I’ve set up a simple demo to showcase the difference in quality here:


You can also check it out locally here: GitHub - filipGG/ktx2-png-compare

Any help or tips would be appreciated!

Found this artist guide:


By following their ultra-high-quality reference I got much better results, at the cost of a slightly larger file size and memory usage:

uastchq="--t2 --encode uastc --uastc_quality 4 --uastc_rdo_l .25 --uastc_rdo_d 65536 --zcmp 22"

Those crisp vector-based graphics can be hard to compress, with ETC1S especially; I think switching to UASTC is a good tradeoff here, as you found. Results could be improved (with either format) when hard edges in the texture are aligned to a 4-pixel, or perhaps a 2-pixel, grid: both formats are block-based compression built on 4x4 blocks. But you’d want to experiment with that, and it’s obviously a fair bit of extra work depending on the overall art workflow.
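To illustrate the block-alignment idea, here is a small helper (the names are mine, not from any library) that snaps an edge coordinate or tile dimension up to the 4-pixel block grid before rasterizing the vector art:

```javascript
// ETC1S and UASTC both encode in 4x4 blocks, so a hard edge that lands
// mid-block gets mixed with neighboring pixels inside that block.
// Snapping edge positions (and tile dimensions) to multiples of the
// block size keeps each block uniform along the edge.
const BLOCK = 4;

// Round a coordinate or dimension up to the next block boundary.
function alignToBlock(value, block = BLOCK) {
  return Math.ceil(value / block) * block;
}

console.log(alignToBlock(1023));  // 1024
console.log(alignToBlock(30000)); // 30000 (already aligned)
```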