I’m fairly new to Three.js, so I apologise if I’m raising a question that has already been answered, and/or if the solution doesn’t exist for obvious reasons that beginner-level developers just don’t know. Yet.
I want to create a landscape of video objects that can be found through audible hints. However, none of the abovementioned examples seems to work on the phone.
Does someone know if this is even feasible, given that the user would need earphones?
The sandbox example works on my Pixel (1) with Chrome. The webaudio_orientation example actually has an issue: play() is called too late, which causes the following security error:
Uncaught (in promise) DOMException: play() can only be initiated by a user gesture
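The usual workaround is to defer playback until the first user gesture, since browsers only allow audio to start from within a gesture handler. Here is a minimal sketch; the helper name `playOnFirstGesture` and the `sound` variable are my own, and in a real three.js app `sound` would be e.g. a `THREE.PositionalAudio` instance:

```javascript
// Sketch: run a playback callback only after the first user gesture,
// avoiding "play() can only be initiated by a user gesture".
function playOnFirstGesture(target, playFn) {
  const start = () => {
    playFn();
    // Remove the listener so playback is triggered only once.
    target.removeEventListener('pointerdown', start);
  };
  target.addEventListener('pointerdown', start);
}

// Browser usage (assumed setup, not from the example):
// playOnFirstGesture(document.body, () => sound.play());
```

A `click` or `touchend` listener works the same way; the point is simply that `play()` must be called synchronously from inside a gesture event handler, not on page load.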
My hope is to make something for a large audience, so a Chrome dependency is definitely an issue. Basically, it needs to run on most devices, both iOS and Android, in Safari and Chrome.
I’m not sure about this. The robustness of such an app does not depend only on three.js. If the app is going to be very complex, there might be browser issues which can only be resolved if bugs are properly filed at the respective trackers. Keep in mind that three.js is only a thin layer over the Web Audio API.