Hi, everyone
I tried to use the Unreal Bloom effect in WebVR, but I can’t see it in my Oculus Quest 2 headset.
I have used a lot of post-processing in three.js projects, but this is my first time using it in VR.
Please let me know if there are any nice methods to make post-processing possible in WebVR scene.
Thank you all.
EffectComposer does not support WebXR right now, since its multi-pass approach does not perform well in XR. The idea was to implement a new system that is able to merge multiple effects into a single (uber)shader. More information at GitHub:
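To illustrate what merging effects into one (uber)shader means, here is a toy sketch (my own illustration, not three.js internals): instead of rendering one full-screen pass per effect, each effect contributes a GLSL snippet and a single fragment shader is generated that applies them all in one pass.

```javascript
// Toy illustration of an "uber shader": merge per-effect GLSL snippets
// into ONE fragment shader, so the scene is post-processed in a single
// pass instead of one render target per effect (the multi-pass approach
// that performs poorly in XR).
function buildUberShader(effects) {
  // Each effect contributes a GLSL function body that transforms `color`.
  const functions = effects
    .map((e, i) => `vec4 effect${i}(vec4 color) {\n${e.glsl}\n}`)
    .join('\n');
  // Chain the effect functions inside a single main().
  const calls = effects
    .map((_, i) => `  color = effect${i}(color);`)
    .join('\n');
  return `
uniform sampler2D tDiffuse;
varying vec2 vUv;
${functions}
void main() {
  vec4 color = texture2D(tDiffuse, vUv);
${calls}
  gl_FragColor = color;
}`;
}

// Two example effects as GLSL snippets.
const tint = { glsl: '  return color * vec4(1.0, 0.8, 0.8, 1.0);' };
const invert = { glsl: '  return vec4(vec3(1.0) - color.rgb, color.a);' };

// `source` is one fragment shader applying both effects in a single pass.
const source = buildUberShader([tint, invert]);
```

This is only the concept; a real implementation also has to merge uniforms, varyings, and per-effect render-target needs, which is what makes it non-trivial.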
@Mugen87 Thank you for your answer. I hope that the latest version of threejs will support these post-processing effects soon.
@Mugen87 Can we possibly use other buttons of VR controller like A, B, X, Y buttons or joysticks (Oculus Quest 2) in threejs? I don’t see any examples for them.
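In case it helps, here is a sketch of how the A/B/X/Y buttons and sticks can be read, assuming the WebXR `xr-standard` gamepad mapping (the helper name and button indices below are my own illustration; verify them on your device):

```javascript
// Sketch: reading Quest 2 controller buttons via the WebXR Gamepads module.
// Button indices follow the 'xr-standard' mapping: 0 trigger, 1 squeeze,
// 2 touchpad, 3 thumbstick, 4 A/X, 5 B/Y. Axes 2/3 are the thumbstick.
const XR_STANDARD_BUTTONS = [
  'trigger', 'squeeze', 'touchpad', 'thumbstick', 'primary', 'secondary',
];

// Pure helper: given a Gamepad-like object, return the names of all
// currently pressed buttons.
function pressedButtons(gamepad) {
  const pressed = [];
  gamepad.buttons.forEach((button, i) => {
    if (button.pressed) pressed.push(XR_STANDARD_BUTTONS[i] || `button${i}`);
  });
  return pressed;
}

// In a three.js scene you would grab the gamepad from the XRInputSource
// delivered by the controller's 'connected' event, e.g.:
//
//   const controller = renderer.xr.getController(0);
//   controller.addEventListener('connected', (event) => {
//     controller.userData.gamepad = event.data.gamepad;
//   });
//
// ...and then poll pressedButtons(controller.userData.gamepad) each frame.

// Mock poll outside the headset, pressing trigger and A/X:
const mock = {
  buttons: [
    { pressed: true },  // trigger
    { pressed: false }, // squeeze
    { pressed: false }, // touchpad
    { pressed: false }, // thumbstick
    { pressed: true },  // A/X
  ],
};
const result = pressedButtons(mock); // ['trigger', 'primary']
```

Polling each frame (rather than relying on events) is the usual pattern, since button state lives on the live `Gamepad` object.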
as a user coming to XR in three.js for the first time, my expectation has been that my code should work the same across XR and non-XR
the glitch effect (from examples) performs well on my android using EffectComposer and from what I’ve seen, performance on mobile in and out of XR is comparable.
imo, if code is not performant, it should be up to the user to decide whether to use it, find an alternative, or choose which devices to support, etc.
i imagine others like myself, familiarising themselves by following the three.js examples, would expect EffectComposer to work.
from my understanding, pmndrs/postprocessing on GitHub (a post processing library that provides image filter effects for three.js) should work in VR.
postpro already combines all shaders into one uber shader, so there are fewer performance concerns. the next release will establish a new MRT pipeline. imo there is little to no reason to use three/jsm/effects over postpro; the latter is in active maintenance and development.
I’ve tried many times to import postprocessing with the current three.js import conventions, with no luck. Is there a way to do this using import maps, or is it even possible? I always find myself disheartened that such powerful addon libraries as postprocessing don’t seem to be inclusive in this way… Is using a bundler a strict requirement?
In general, packages on npm are meant to be installed through npm — that they get mirrored onto third-party CDNs is not controlled by library authors or npm, and three.js is unusual for doing so much to support importing through CDNs. Personally I just publish libraries in an npm-compatible format, CommonJS and/or ESM, and leave it to users to build from source or modify existing builds if they are using a setup that can’t resolve npm package names. An import map setup breaks down pretty fast if your dependencies have dependencies with dependencies…
That said, if you want to use postprocessing with an import map, I think you’d need to select an ESM entrypoint manually; most CDNs can’t handle this for you. So for example the postprocessing name should resolve to “https://unpkg.com/postprocessing@6.29.3/build/postprocessing.esm.js” rather than the default UMD path unpkg would give you.
EDIT: Alternatively, it looks like the https://esm.sh/ CDN handles this better and gives you an ESM entrypoint automatically.
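For reference, a minimal import map along those lines (the three.js URL and version here are placeholders of my own; the postprocessing URL is the one quoted above):

```html
<!-- Sketch of an import map pinning the bare "postprocessing" specifier
     to an ESM build, so `import ... from 'postprocessing'` resolves
     without a bundler. Version numbers are placeholders. -->
<script type="importmap">
{
  "imports": {
    "three": "https://unpkg.com/three@0.150.0/build/three.module.js",
    "postprocessing": "https://unpkg.com/postprocessing@6.29.3/build/postprocessing.esm.js"
  }
}
</script>
<script type="module">
  import { EffectComposer, RenderPass, EffectPass, BloomEffect } from 'postprocessing';
  // ...set up the composer with your renderer, scene, and camera here.
</script>
```

With the esm.sh alternative mentioned above, the second URL could instead point at esm.sh, which resolves the ESM entrypoint for you.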
That makes a lot of sense, thank you for such an insightful answer overall.
my assumption was that this was possibly by design (?), not only to accommodate the vast spectrum of user implementations but also to anticipate future browser specification progress… In assuming that, I am merely going on three.js’s advanced nature in terms of its progressive infrastructure for accessing WebGL.
Thank you for linking this, that’s so helpful, I’ll give this a try ASAP, it looks promising!
I don’t get the sense that browser vendors are going to provide real alternatives to bundlers. I think the choice by three.js mostly has to do with making three.js more accessible to 3D artists, generative artists, hobbyists, students, datavis specialists, and others who are not necessarily working in today’s JavaScript ecosystem full-time. three.js tends to be a “gateway” for people from different backgrounds in a way that most JS libraries are not, and benefits from that diversity of experience and interests. Maybe the idea is to be a little more welcoming than most JS libraries, as a result.
It is a hard balance to find, though, especially as more dependencies get involved. I (personally) still tend to advise using something like Vite; the learning curve is lower than for other bundlers.
esm with import maps seems a little bothersome to create without tools, but it’s also not easy for library authors. i wouldn’t expect many packages to navigate the exports field correctly. just yesterday codesandbox broke trying to make exports work, with andarist of all people working on it; how does an ordinary author stand a chance?
vite and esbuild imo are the best chance for esm to succeed; being purely esm/http2, they help the spec, and they can also prepare libraries for publishing. the hope is that tooling will make the spec serviceable. unfortunately, at the moment not even vite can produce a usable outcome. we’ve tried (pmndrs/meshline) and there are always environments where importing fails. i’m not even sure if it’s possible without disrupting large parts of your userbase.
ps, here’s an overview from tobias koppers, the webpack author, detailing the minefield that is modern publishing; thanks to esm it has become unbridled chaos: https://twitter.com/wSokra/status/1329088911528714244 scrolling through that, i feel so sorry for them.
the situation is so dire that library authors often cannot move on, as it would make their library incompatible with most if not all of their userbase. this is why people are losing faith when it comes to the w3c and the specs process. and the only hope that this won’t turn out worse is, again, tooling.
the webpack author detailing the minefield that is modern publishing, thanks to esm it has become unbridled chaos…
I don’t think you mean to blame ESM here, right? the problem is that interop between incompatible module systems is really hard. That transition was never going to be easy, but I’m mostly frustrated that it has been dragged out for 8 years. It’s long overdue for CommonJS to start going away. In the meantime, yeah, tooling does its best.
i shouldn’t discuss inflammatory issues, i have hot blood
in my opinion some blame definitely falls on esm, probably most of it. i remember some of the node devs coming out of the committee meetings utterly destroyed because their concerns were brushed off. the committee didn’t think it necessary to anticipate migration, and the decision that cjs and esm can’t be mixed is why esm is failing.
all in all, i don’t think javascript devs can be blamed. if the committee really thought individual devs who barely scrape by would simply go esm, meaning 99% of their userbase could no longer use their libs and would fork, they were clearly mistaken. all the blame should be on the people who thought they could get this done by coercion.
Back to the topic, is there any news on postprocessing with XR?
I load them just like everything else… three/addons/postprocessing/Effects etc etc
Works fine…
However, loading from a CDN can be tricky because each CDN seems to have a different way of hosting the addons, if at all.
Hi, I’ve created a WebXR demo here:
https://www.beemsoft.nl/vr/src/demo/webxr_vr_postprocessing_unreal_bloom/index.html
It is based on the original three.js demo:
https://threejs.org/examples/webgl_postprocessing_unreal_bloom.html
Demo code is here: