How to record gameplay in a mobile browser

Hello :slight_smile:
I am building a WebAR experience (AR.js with a-frame) and I need users to record short videos of their gameplay.

Here’s the desired user flow:

  1. User loads the website on their mobile (Android Chrome, iOS Safari)
  2. User plays around in the 3D environment
  3. User clicks on a “RECORD” button
  4. For the next 5-10 seconds whatever is rendered by three.js is saved as a video file
  5. The user previews the recording and decides whether to save it or start over

My first approach was to record from outside of three.js, using getUserMedia and recording the resulting stream. It does work on Android but I am having trouble with iOS.
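For context, this is roughly what that looks like. A minimal sketch, assuming AR.js exposes the camera as a `<video>` element and that `#preview` is a `<video>` tag for playback (the element selectors are placeholders):

```js
// MediaRecorder on a getUserMedia stream works on Android Chrome;
// it's the part iOS Safari doesn't support (at least in my tests).
const video = document.querySelector('video'); // AR.js camera <video>

function record(seconds) {
  const stream = video.srcObject; // MediaStream from getUserMedia
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: 'video/webm' });
    // preview step: let the user watch it and decide to save or retry
    document.querySelector('#preview').src = URL.createObjectURL(blob);
  };

  recorder.start();
  setTimeout(() => recorder.stop(), seconds * 1000);
}
```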

So I am thinking - can one get the rendered frames from three.js and build a video file?

I need advice, guys :slight_smile:

Got it working.

  1. Use preserveDrawingBuffer: true (for a-frame you need to edit the library)
  2. Create a new canvas (it can be display: none)
  3. On every frame, draw both the camera feed and the three.js canvas onto the new canvas via its 2D context: ctx.drawImage(cameraFeed, 0, 0), ctx.drawImage(threeCanvas, 0, 0) (see the sketch after this list)
  4. On iOS, save the canvas data to an array each frame with newCanvas.toDataURL(); when the recording ends, send the data to a server and assemble the video with ffmpeg
  5. On Android, use WebRTC (we used RecordRTC) to create a WebM video
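A minimal sketch of the compositing part (steps 1–3). Names like `cameraVideo` and `recCanvas` are placeholders; with plain three.js you create the renderer yourself like below, while with a-frame that flag is exactly why we had to edit the library:

```js
// Step 1: the three.js canvas must keep its pixels readable after render
const renderer = new THREE.WebGLRenderer({ preserveDrawingBuffer: true });

// Step 2: hidden canvas that holds camera feed + 3D overlay combined
const recCanvas = document.createElement('canvas');
recCanvas.style.display = 'none';
document.body.appendChild(recCanvas);
const ctx = recCanvas.getContext('2d');

const cameraVideo = document.querySelector('video'); // AR.js camera feed

// Step 3: composite both layers once per frame
function composite() {
  if (recCanvas.width !== renderer.domElement.width) {
    recCanvas.width = renderer.domElement.width; // keep sizes in sync
    recCanvas.height = renderer.domElement.height;
  }
  ctx.drawImage(cameraVideo, 0, 0, recCanvas.width, recCanvas.height); // background
  ctx.drawImage(renderer.domElement, 0, 0); // 3D content on top
  requestAnimationFrame(composite);
}
composite();
```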

Hope it helps someone someday :slight_smile:


Hi kosmoskolio, can you please provide a working example of how you did that?

I’m afraid I won’t be able to provide any code - the project was for a client and is long closed.
The best I can do is try to explain our approach. We made it work both on iOS and on Android, but using different methods.

Android:

  • create a new canvas
  • on each frame, draw both the camera feed and the three.js canvas onto the new canvas
  • use RecordRTC to record the combined content of the new canvas (sketched below)
  • if you are using a-frame: we had to downgrade to version 0.7.1 (I can’t remember the exact reason, but I believe it had to do with different behaviour of preserveDrawingBuffer)
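A hedged sketch of the recording part, assuming `recCanvas` is the combined canvas from the compositing loop in my earlier post and that the RecordRTC script is loaded globally (our exact config is gone with the project):

```js
// captureStream() turns the combined canvas into a MediaStream
const stream = recCanvas.captureStream(25); // 25 fps

// RecordRTC with type: 'video' records the stream to WebM
const recorder = RecordRTC(stream, {
  type: 'video',
  mimeType: 'video/webm',
});

function startRecording() {
  recorder.startRecording();
  setTimeout(stopRecording, 8000); // a 5-10 s clip
}

function stopRecording() {
  recorder.stopRecording(() => {
    const blob = recorder.getBlob();
    // hand the blob to your preview / save step
    document.querySelector('#preview').src = URL.createObjectURL(blob);
  });
}
```

RecordRTC essentially wraps the standard MediaRecorder here, so plain `new MediaRecorder(stream)` should also work on Android Chrome if you prefer no dependency.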

iOS:

  • create a new canvas
  • on each frame, draw both the camera feed and the three.js canvas onto the new canvas
  • on each frame, grab the canvas content with .toDataURL() and push it to an array or similar
  • when the recording ends, send the array to a web server and build a video from it (we used ffmpeg; see the sketch below)
  • for iOS we used a-frame 0.8.2 (again, I don’t remember why)
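A sketch of the iOS path, again assuming the same `recCanvas`; the `/build-video` endpoint and the JPEG quality are placeholders. The server just needs to decode the data URLs into numbered images and feed them to ffmpeg:

```js
const frames = [];
let recording = false;

function captureLoop() {
  if (!recording) return;
  // one compressed frame per tick; JPEG keeps the array (and upload) smaller
  frames.push(recCanvas.toDataURL('image/jpeg', 0.7));
  requestAnimationFrame(captureLoop);
}

function startRecording() {
  frames.length = 0;
  recording = true;
  captureLoop();
  setTimeout(stopRecording, 8000);
}

async function stopRecording() {
  recording = false;
  // server decodes frames to frame_0001.jpg, frame_0002.jpg, ... and runs e.g.:
  //   ffmpeg -framerate 30 -i frame_%04d.jpg -c:v libx264 -pix_fmt yuv420p out.mp4
  await fetch('/build-video', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ frames }),
  });
}
```

Note that requestAnimationFrame doesn’t guarantee a fixed frame rate, so the -framerate you pass to ffmpeg is an approximation; it was good enough for our 5-10 second clips.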

Hope that helps.
