Hi all! Big noob here. I have been making apps for years, but in Max/MSP, and am just beginning my web app/ThreeJS journey.
My first major project is a 3D soundboard that works perfectly on the desktop version of the site. However, in trying to implement mobile controls (TouchControls), I have gone past the point where I understand what I am doing. More or less, the working example uses a third input for the controls, where all the native ThreeJS controls use two (camera, renderer.domElement). TouchControls seems to employ a "container", which I imagine is the scene itself, but I am not clear how to provide that in my code. Do I make the whole scene a 3D object? Do I render the JSON of the scene and use that?
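For reference, here is a minimal sketch of the two patterns as I understand them. The TouchControls argument order and option names below are my guesses from the example I am copying (it is not a documented three.js API), and I am assuming the container is a plain DOM element wrapping the canvas rather than the scene itself:

```js
import * as THREE from 'three';
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);

// Desktop: the native controls only need two inputs.
const desktopControls = new OrbitControls(camera, renderer.domElement);

// Mobile: the TouchControls example takes a third "container" input.
// TouchControls comes from the example's touch-controls.js (not three.js core),
// and the argument order / option names here are assumptions on my part.
const container = document.getElementById('app'); // hypothetical <div id="app"> wrapping the canvas
container.appendChild(renderer.domElement);

const touchControls = new TouchControls(container, camera, {
  hitTest: true // the config flag mentioned below, plus whatever else the example passes
});
```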
Last thing: I found that if I remove `if (self.config.hitTest) self.hitTest();` from touch-controls.js, the scene loads, but the camera faces the wrong way, in the wrong orientation, and while I can change the camera's position (and have it console-logged in different positions), the view itself remains static, facing the same way and not responding to touch input.
P.S. This is for a somewhat secret project, so please keep it secret/forum-based so that my release/reveal isn't stunted. Thanks so much in advance for any insight; I have read every article (and forum post) I could find on this subject and even worked with a TA from my coding bootcamp to try to save me from posting, but I am so close and know I can crack it with the help of this community! Thanks again, especially to @Mugen87, whose myriad answers have unstuck me enough to get this far!
P.P.S. While I am here, I also had issues with collision detection that I couldn't seem to crack. I installed physi.js and was able to get some physics working (gravity, a floor), but got turned around trying to add a 3D object to the camera (as a child) so that the camera would stop when it hit an object (the wall, the snakes, the podium, etc.). Insight here is welcome, too! thxx
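In case it clarifies what I am after, here is a rough sketch of the behaviour I want, written as a plain three.js raycaster check rather than with physi.js; the mesh names and the moveCamera helper are placeholders, not my actual code:

```js
import * as THREE from 'three';

// Hypothetical list of meshes the camera should not pass through
// (the wall, the snakes, the podium, etc.).
const colliders = [wallMesh, snakeMesh, podiumMesh];

const raycaster = new THREE.Raycaster();
const minDistance = 1.5; // how close the camera may get to an obstacle

// Cast a ray in the intended movement direction and only apply the
// translation if nothing is within range; otherwise leave the camera put.
function moveCamera(camera, direction, distance) {
  const dir = direction.clone().normalize();
  raycaster.set(camera.position, dir);
  raycaster.far = minDistance + distance;

  const hits = raycaster.intersectObjects(colliders, true);
  if (hits.length === 0 || hits[0].distance > minDistance + distance) {
    camera.position.addScaledVector(dir, distance); // path is clear, move
  }
}
```

The idea is to test the intended movement for obstructions before applying it, instead of parenting a physics body to the camera.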