TouchControls Help

Hi all! Big noob here: I have been making apps for years, but in Max/MSP, and I am just beginning my web app/ThreeJS journey.

My first major project is a 3D soundboard that works perfectly in the desktop version of the site. However, in trying to implement mobile controls (TouchControls), I have gone past the point where I understand what I am doing. The working example passes a third input to the controls (where all the native three.js controls take two: camera and renderer.domElement). TouchControls seems to employ a "container", which I imagine is the scene itself, but I am not clear how to supply that in my code. Do I make the whole scene a 3D object? Do I render the JSON of the scene and use that?

Last thing: I found that if I remove the

`if (self.config.hitTest)`

check from touch-controls.js, the scene loads, but the camera faces the wrong way and is in the wrong orientation. While I can change the camera's position (and have console-logged it in different positions), the view itself remains static, facing the same way and not responding to touch input.
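For reference, here is roughly how the example I am working from wires things up (I believe it is the mese79/touch_controls library; the option names below come from that example and may differ in other versions, so please double-check against your copy). Notably, the "container" there is a DOM element wrapping the renderer's canvas, not the scene, and hitTest looks like it can be switched off through the options rather than by editing the library source:

```javascript
// Rough wiring, mese79/touch_controls style. 'container' is the DOM element
// that wraps the renderer's canvas -- not the scene. Option names are taken
// from that library's example and may vary by version.
const container = document.getElementById('container');

const options = {
  speedFactor: 0.5,
  delta: 1,
  rotationFactor: 0.002,
  maxPitch: 55,
  hitTest: false,      // disable here instead of deleting the check in the source
  hitTestDistance: 40,
};

const controls = new TouchControls(container, camera, options);
controls.setPosition(0, 35, 400);
controls.addToScene(scene);

// In the render loop:
//   controls.update();
```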

Live site

Source code

P.S. This is for a somewhat secret project, so please keep this project secret/forum-based so that my release/reveal isn't stunted. Thanks so much in advance for any insight. I have read every article (and forum post) I could find on this subject and even worked with a TA from my coding bootcamp to try to save me from posting, but I am so close and know I can crack it with the help of this community! Thanks again, especially to @Mugen87, whose myriad answers have unstuck me enough to get this far!

P.P.S. While I am here: I also had issues with collision detection that I couldn't seem to crack. I installed Physijs and was able to get some physics working (gravity, floor), but got turned around trying to add a 3D object to the camera (as a child) so that it would stop when it hit an object (the wall, the snakes, the podium, etc.). Insight here is welcome, too! Thanks!


Hello again! I was able to get the controls to appear and react to touch events, but there is still one major problem that, despite my best efforts, I cannot seem to solve.

In all of the other control libraries (FirstPerson, Orbit, etc.), when the view of the scene changes, so does the position of the camera. In the TouchControls library, however, the camera's position doesn't change when console-logged.

Instead, in TouchControls.js the camera is added to a 3D object called fpsBody, and fpsBody is in turn added to another 3D object called "cameraHolder"; both of these seem to move around the space with the camera inside them.
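To convince myself of what is going on, here is a tiny plain-JS model of that nesting (ignoring rotation, no three.js required): the camera keeps a fixed local position inside its parent, and only the parent moves, which would explain why logging camera.position never changes.

```javascript
// Plain-JS model of a parent/child transform (translation only): the world
// position of the child is the parent's position plus the child's local
// offset. 'holder' stands in for cameraHolder from the TouchControls source.
function worldPosition(parentPos, localPos) {
  return {
    x: parentPos.x + localPos.x,
    y: parentPos.y + localPos.y,
    z: parentPos.z + localPos.z,
  };
}

const cameraLocal = { x: 0, y: 0, z: 0 }; // camera.position -- stays fixed
let holder = { x: 0, y: 0, z: 0 };        // cameraHolder.position -- moved by the pads

holder = { x: 5, y: 0, z: -3 };           // user walks with the movement pad
console.log(worldPosition(holder, cameraLocal)); // { x: 5, y: 0, z: -3 }
```

If this model is right, three.js's own `camera.getWorldPosition(new THREE.Vector3())` should report the real, moving world position (it walks the whole parent chain) even while `camera.position` stays fixed.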

Because I can’t figure out what is actually changing position, I can’t seem to limit or control the TouchControls experience. There are two fairly simple things I need to accomplish:

  1. Fix the rotation pad so that it rotates from WITHIN the camera, as if looking around first-person style. Right now, depending on where the movement pad has recentered, it seems to calculate a radius and rotate around that (and it lets the user pass through the floor and rise up off the ground, which is not the desired effect).

  2. Limit the movement pad on the x and z axes, similar to the camera's limits in the desktop version, so that it won't pass through the 'walls' of my room. On desktop I achieved this by setting minimum and maximum X and Z coordinates for the camera, but in TouchControls the camera apparently isn't what is moving.
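My plan, once I know which object actually moves, is to reuse the desktop min/max trick on that object every frame, plus a clamp on pitch for the rotation pad. A sketch with made-up bounds (fpsBody/cameraHolder are names from the TouchControls source; substitute whichever one your console logs show moving):

```javascript
// Clamp whatever object the pads actually move. ROOM bounds and MAX_PITCH
// are illustrative numbers -- substitute your real wall coordinates.
const ROOM = { minX: -50, maxX: 50, minZ: -50, maxZ: 50 };
const MAX_PITCH = Math.PI / 3; // limit how far the rotation pad can look up/down

const clamp = (v, lo, hi) => Math.min(Math.max(v, lo), hi);

function clampToRoom(pos) {
  pos.x = clamp(pos.x, ROOM.minX, ROOM.maxX);
  pos.z = clamp(pos.z, ROOM.minZ, ROOM.maxZ);
  return pos;
}

function clampPitch(pitch) {
  return clamp(pitch, -MAX_PITCH, MAX_PITCH);
}

// In the render loop, after controls.update():
//   clampToRoom(cameraHolder.position);
//   cameraHolder.rotation.x = clampPitch(cameraHolder.rotation.x);

console.log(clampToRoom({ x: 120, y: 2, z: -80 })); // { x: 50, y: 2, z: -50 }
```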

I believe that if I can figure out what is moving the user's view (fpsBody, cameraHolder, the camera, or something else), I can set the appropriate restrictions. If anyone has some time to help, I am on the ThreeJS Discord server and will be working on this all day. Thanks in advance for any insight; I am very grateful for this community 🙂