PositionalAudio setMediaStreamSource with WebRTC Question (not hearing any sound)

Hi All!

I asked this on the Discord but didn’t get a response. I’ve added some more information below, which I hope will help clarify my question.

I have an incoming WebRTC audio/video stream which I am splitting into a video stream for use as a texture and an audio stream that I am trying to use as the source for a positional audio object. I am using the positionalAudio.setMediaStreamSource function with release r115. After I run the following code, I get an Audio object in the scene (i.e. if I print out scene.children, I see the object with a MediaStreamAudioSourceNode under AudioObj.context.source).

But, I don’t hear anything.

Any advice on how to debug this would be super helpful!

let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);

let audioSource = new THREE.PositionalAudio(myAudioListener);
audioSource.setMediaStreamSource(audioStream);
audioSource.setRefDistance(5);
audioSource.setMaxDistance(20);
audioSource.setRolloffFactor(5);
audioSource.setVolume(1);

scene.add(audioSource);
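(Side note: one common cause of silent Web Audio in Chrome is the autoplay policy, which leaves the AudioContext in a 'suspended' state until a user gesture, muting all output. A quick check to rule that out, assuming the same myAudioListener as above:)

```javascript
// Sketch: resume the listener's AudioContext on the first user gesture.
// Chrome starts the context 'suspended' until the user interacts with the page.
document.body.addEventListener('click', () => {
  const context = myAudioListener.context;
  if (context.state === 'suspended') {
    context.resume();
  }
}, { once: true });
```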

Related Questions:

Can you please first try it with THREE.Audio instead of THREE.PositionalAudio? Just to ensure that it is not a transformation issue.

Okay, confirmed that this also does not work (no sound) with THREE.Audio:

let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);
let audioSource = new THREE.Audio(myAudioListener);
audioSource.setMediaStreamSource(audioStream);

Interestingly,

  1. this does work with my local audio stream from navigator.mediaDevices.getUserMedia()
  2. when I pipe the same audioStream into an HTML audio element, I can hear it:

let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);
let audioEl = document.createElement('audio');
audioEl.srcObject = audioStream;
audioEl.play();
document.body.appendChild(audioEl);

Any thoughts?

Can you share your code as a git repository? Or maybe as an editable live example?

Sure. The relevant code is in public/js/index.js:260. (I just quickly adjusted it to work without my WebRTC ICE server provider (Twilio), so if anything isn’t working, let me know and I’ll take a look.)

Thank you!

After downloading the repo and installing the dependencies I get the following runtime error when using npm start:

/YORB2020-master/node_modules/twilio/lib/rest/Twilio.js:126
throw new Error('username is required');
^
Error: username is required
at new Twilio (/YORB2020-master/node_modules/twilio/lib/rest/Twilio.js:126:11)
at initializer (/YORB2020-master/node_modules/twilio/lib/index.js:9:10)
at Object.<anonymous> (/YORB2020-master/server.js:57:39)

How do I start the application?

I just pushed a change (removing Twilio entirely). If you git pull, it should work now. Let me know if that is not the case.

Thanks!

The minimal setup should be:

let audioSource = new THREE.Audio(glScene.listener);
audioSource.setMediaStreamSource(audioStream);

And that actually works in Firefox, but not in Chrome. Can you please also make a test with FF?

Okay. Tested the following with Firefox (75.0) and heard the incoming WebRTC stream:

let audioStream = new MediaStream([_remoteStream.getAudioTracks()[0]]);
let audioSource = new THREE.Audio(glScene.listener);
audioSource.setMediaStreamSource(audioStream);

Then I tested with PositionalAudio and also heard the incoming stream, but without positional effects. (EDIT: PositionalAudio is working in Firefox using the code below.) I played with the rolloffFactor, refDistance, and distanceModel as well, but to no avail. Ideally, I’d like to hear the sound when within 5 units or so, then not hear it at all beyond that. (I am assuming distance units match three.js world units?)

let audioSource = new THREE.PositionalAudio(glScene.listener);
audioSource.setMediaStreamSource(audioStream);
audioSource.setRefDistance(20);
glScene.scene.add(audioSource);
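As an aside, for the "silent beyond ~5 units" behavior, the 'linear' distance model might be the right tool, since it reaches zero gain exactly at maxDistance (the default 'inverse' model only approaches silence asymptotically). A sketch of the Web Audio spec's linear gain formula, with hypothetical values:

```javascript
// Linear distance model: gain falls from 1 at refDistance to 0 at maxDistance.
// gain = 1 - rolloffFactor * (d - refDistance) / (maxDistance - refDistance)
function linearGain(distance, refDistance, maxDistance, rolloffFactor) {
  // the spec clamps the distance to [refDistance, maxDistance]
  const d = Math.min(Math.max(distance, refDistance), maxDistance);
  return 1 - rolloffFactor * (d - refDistance) / (maxDistance - refDistance);
}

console.log(linearGain(1, 1, 5, 1)); // 1   (full volume at refDistance)
console.log(linearGain(3, 1, 5, 1)); // 0.5
console.log(linearGain(5, 1, 5, 1)); // 0   (silent at maxDistance and beyond)
```

On the three.js side this would correspond to audioSource.setDistanceModel('linear'), audioSource.setRefDistance(1), and audioSource.setMaxDistance(5), with distances in world units.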

I’m wondering whether it is okay that my AudioListener is parented to a Group rather than an Object3D?

Normally, the listener should be added to the camera.
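For example (assuming `camera` is your render camera):

```javascript
// The listener inherits the camera's world position and orientation,
// so panning follows the viewpoint automatically.
const listener = new THREE.AudioListener();
camera.add(listener);
```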

After some research, it seems the minimal setup reproduces this bug in Chrome:

https://bugs.chromium.org/p/chromium/issues/detail?id=933677

Okay. See the edit above (that code works in Firefox 75.0), and PositionalAudio works as well. Should I submit an issue for Chrome? Or is there more I can do to debug on my end?

Thank you again!

There is already an issue. Please check out the link in my previous post.

Unfortunately no, this is a browser bug.

I meant an issue on the three.js repo, just so it is visible to the three.js community? Let me know.

Thanks again for the help! That was a persnickety little bug.

I think the discussion in this topic is sufficient. Especially since we can’t fix this issue on our side.


Hi @Mugen87, just wanted to follow up on this thread, as the Chromium bug thread now references it. A user there has mentioned a possible solution:

  1. Create an HTML5 audio element,
  2. Set the audio element source to the incoming MediaStream,
  3. Use that audio element as a source for the webAudio positional audio object.

I am trying to implement this, but I’m running into the issue of both audio sources (the HTML5 audio element and the Web Audio / three.js PositionalAudio) now playing at once (and therefore losing the stereo imaging / positionality of the PositionalAudio). Do you have any recommendations for this?

That user was me :blush:.

Um, that does not sound right to me. I’ve tested this workaround successfully on my computer :thinking:

Ahh, wonderful! Well, maybe you could tell me what I was doing wrong here? I am using Mediasoup as a selective forwarding unit server, so consumer refers to the incoming media source.

// Create HTML audio element
var el = document.createElement('audio');
document.body.appendChild(el);
el.controls = true;
// when this is active, audio plays globally
// el.setAttribute('autoplay', true);

// set html element source object to incoming stream:
el.srcObject = new MediaStream([consumer.track.clone()]);
el.consumer = consumer;

// THREE.js PositionalAudio
let audioSource = new THREE.PositionalAudio(this.listener);
audioSource.setRefDistance(10);
audioSource.setRolloffFactor(10);
// add to our 'clients' object
clients[_id].group.add(audioSource);
clients[_id].positionalAudioSource = audioSource;
// set audiosource
audioSource.setMediaElementSource(el);

These two lines should not be necessary.

I was mainly using those two lines to test different scenarios. Here is my current issue:

When the HTML audio element is playing:

  • In Firefox (75.0), I can set the HTML audio element volume to 0 and can still hear the positionalAudio. :slight_smile:
  • In Chrome, when I set the HTML audio element volume to 0, I cannot hear the positionalAudio. (And when the HTML audio volume is up, I can’t tell whether I’m hearing the positionalAudio at all over the global audio.) :frowning:

When the HTML audio is not playing:

  • In Firefox and Chrome, I don’t hear the PositionalAudio object.

Could you share your workaround test code? I’m wondering if I am missing something simple…

This example uses an HTML5 audio element and passes it to an instance of PositionalAudio.

https://threejs.org/examples/webaudio_orientation

The only difference is the way the audio element is created, although this should not matter.
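If the media-element route keeps double-playing in Chrome, a variant often cited in the Chromium thread is to keep the HTML element muted, purely so Chrome routes the WebRTC audio, and feed the MediaStream itself (not the element) to three.js. A sketch, assuming the same Mediasoup `consumer`, `this.listener`, and `clients[_id]` context as the code above:

```javascript
const audioStream = new MediaStream([consumer.track.clone()]);

// Muted element: produces no sound itself, but keeps Chrome
// pulling audio from the WebRTC track.
const el = document.createElement('audio');
el.srcObject = audioStream;
el.muted = true;
el.play();

// Feed the *stream* (not the element) to the positional audio node,
// so only the spatialized Web Audio path is audible.
const audioSource = new THREE.PositionalAudio(this.listener);
audioSource.setMediaStreamSource(audioStream);
audioSource.setRefDistance(10);
clients[_id].group.add(audioSource);
```

Whether the muted element is enough to keep the stream flowing may depend on the Chrome version, so treat this as something to test rather than a guaranteed fix.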