When rendering directly to canvas, using renderer.antialias = true; gives you nice results. However, when rendering to a WebGLRenderTarget, you lose this built-in antialiasing mechanism.
My question is: what anti-aliasing approach does the browser use to achieve the built-in AA result? Is it supersampling somehow? I’m trying to re-create it with a post-processing pass, but FXAA doesn’t look as good, and a MultisampleRenderTarget is only available in WebGL2.
Basically just that: MSAA. The MultisampleRenderTarget makes it available for render targets, giving you the same result you see on canvas without postprocessing. But it depends on the hardware and the settings.
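As a sketch of what that looks like in practice (assuming you already have a `renderer`, `scene`, `camera`, and a target size; note that older three.js releases shipped a `WebGLMultisampleRenderTarget` class, while newer ones removed it in favour of a `samples` option on the plain `WebGLRenderTarget`, so check your version):

```javascript
import * as THREE from 'three';

// WebGL2 only: an offscreen target that resolves with MSAA.
const target = new THREE.WebGLMultisampleRenderTarget(width, height);
target.samples = 4; // sample count, clamped to what the GPU supports

renderer.setRenderTarget(target);
renderer.render(scene, camera);
renderer.setRenderTarget(null); // back to the default framebuffer
```

On a WebGL1 context the class simply isn’t usable, which is what forces the post-processing route below.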
FXAA is still the best trade-off between performance and quality. Other AA techniques are more expensive, but you can give them a try if FXAA doesn’t fit your needs.
Yes, but actually no. MSAA is more efficient, since it only computes subpixel samples at the edges of triangles, but it isn’t guaranteed to be applied, nor at which sample level. It does look prettier, but some of your users’ hardware might not provide it, and then there is no AA at all.
Supersampling will give you a flawless result with terrifying performance cost, since you’re effectively going to render at 4K for a regular 1080p screen.
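The cost is easy to put numbers on: fragment work scales with the square of the linear scale factor, so doubling the resolution per axis (1080p → 4K) quadruples the pixels shaded every frame. A minimal sketch (the 2× factor is just a typical supersampling choice):

```javascript
// Supersampling cost: pixel count grows with the square of the linear factor.
function pixelCount(width, height) {
  return width * height;
}

const native = pixelCount(1920, 1080);       // regular 1080p canvas
const supersampled = pixelCount(3840, 2160); // rendered at 2x per axis ("4K")

console.log(supersampled / native); // 4 -- four times the fragment work
```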
FXAA is very cheap compared to all other options, and also the one that works on mobile. The example scene you linked also isn’t exactly the best-case scenario; in actual scenes with less heavy contrast and fewer hard diagonal edges, the aliasing becomes much less visible.
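For reference, wiring FXAA up as a post-processing pass in three.js looks roughly like this (a sketch assuming an existing `renderer`, `scene`, `camera`, and canvas size; the example import paths vary between three.js versions):

```javascript
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { FXAAShader } from 'three/examples/jsm/shaders/FXAAShader.js';

const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// FXAA runs on the rendered image, so it needs the inverse resolution.
const fxaaPass = new ShaderPass(FXAAShader);
fxaaPass.material.uniforms['resolution'].value.set(1 / width, 1 / height);
composer.addPass(fxaaPass);

// In the animation loop, call composer.render() instead of renderer.render().
composer.render();
```

Remember to update the `resolution` uniform (and the composer size) on window resize, or the edge detection will sample at the wrong scale.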
Thanks for your answer! I guess there’s no way for me to re-create MSAA in WebGL1, so I’ll stick with FXAA for now.
I was wondering, is there a way to get higher quality with the shader code in the repo? I tried changing the FXAA_QUALITY_PRESET in line 84 from 12 to 39 as suggested in the comments, but I got no noticeable visual improvement. I also saw in the notes that lots of quality features only apply to 360, PS3 or PC… is that talking about the Xbox 360 and the PlayStation 3?
Sorry, I was referring to the MultisampleRenderTarget. What the hardware does by default can be supersampling, or probably FXAA and others too, depending on the settings. Most GPUs support it, and at least in my experience it is standard, unless it’s bad onboard graphics or a notebook with NVIDIA Optimus; in those cases there is usually no AA at all.
If AA is important and you don’t want to rely on the client system, it might be better to add it manually, maybe as an option for the game or app; getting no AA at all on a screen that isn’t high-DPI can ruin it.
Ok, that was confusing. I thought your answer to my original question was that renderer.antialias = true uses MSAA. We already knew the MultisampleRT uses MSAA, it’s right there in the title!
So just to clarify, antialias = true could give you a wide range of approaches depending on the graphics card?
Yes, sorry, I had no sleep; after a quick read I thought you meant the MSRT for some reason.
From the spec:
antialias
If the value is true and the implementation supports antialiasing the drawing buffer will perform antialiasing using its choice of technique (multisample/supersample) and quality. If the value is false or the implementation does not support antialiasing, no antialiasing is performed.
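Since the spec only promises best-effort AA, you can ask the context at runtime whether the request was actually honoured and fall back to a manual pass if not. A minimal browser-side sketch using the standard `getContextAttributes()` call:

```javascript
// Ask the browser whether the antialias request was honoured.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl', { antialias: true });

// The returned attributes reflect what was granted, not what was requested.
if (gl && gl.getContextAttributes().antialias) {
  console.log('drawing buffer is antialiased');
} else {
  console.log('no built-in AA; consider adding FXAA as a fallback');
}
```

With three.js you can do the same check via `renderer.getContext().getContextAttributes()`.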
Notebooks and office desktops are usually the most critical in that regard: notebooks try to aggressively save battery with Optimus and other power options (which usually disable the GPU for browsers), and office machines (at least cheap or old ones) don’t come with a card, or only poor or unsupported integrated graphics.