I asked this on the discord, but didn’t get a response. I’ve added some more information below which I hope will help clarify my question.
I have an incoming WebRTC audio/video stream which I am splitting into a video stream for use as a texture, and an audio stream which I am trying to use as the source for a positional audio object. I am using the positionalAudio.setMediaStreamSource function with release r115. After I run the following code, I get an Audio object in the scene (i.e. if I print out scene.children, I see the object with a MediaStreamSourceNode under AudioObj.context.source).
But, I don’t hear anything.
Any advice on how to debug this would be super helpful!
let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);
let audioSource = new THREE.PositionalAudio(myAudioListener);
audioSource.setMediaStreamSource(audioStream);
audioSource.setRefDistance(5);
audioSource.setMaxDistance(20);
audioSource.setRolloffFactor(5);
audioSource.setVolume(1);
scene.add(audioSource);
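One thing worth checking first (a debugging sketch, not from the original code): browsers create the AudioContext in a "suspended" state until a user gesture, which produces exactly this symptom, a correct-looking audio graph with no sound. A helper along these lines (the name is mine, illustrative only) can be called from a click handler:

```javascript
// Hypothetical helper (not from the original post): browsers suspend the
// AudioContext until a user gesture, which silences the whole graph even
// though scene.children looks correct.
function ensureAudioRunning(listener) {
  const ctx = listener.context; // the AudioListener's underlying AudioContext
  if (ctx.state === 'suspended') {
    ctx.resume(); // returns a Promise; call from a user-gesture handler
    return 'resuming';
  }
  return ctx.state;
}
```

For example: `document.body.addEventListener('click', () => ensureAudioRunning(myAudioListener), { once: true });`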
Ok, confirmed that this also does not work (no sound) with THREE.Audio():
let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);
let audioSource = new THREE.Audio(myAudioListener);
audioSource.setMediaStreamSource(audioStream);
Interestingly, this does work with my local audio stream from navigator.getUserMedia(). And the incoming WebRTC stream is audible when I pipe the same audioStream into an HTML audio element:
let audioStream = new MediaStream([incomingWebRTCStream.getAudioTracks()[0]]);
let audioEl = document.createElement('audio');
audioEl.srcObject = audioStream;
audioEl.play();
document.body.appendChild(audioEl);
Sure. The relevant code is in public/js/index.js: 260 (I just quickly adjusted to work without my WebRTC ICE server provider (Twilio), so if anything isn’t working, let me know and I’ll take a look)
After downloading the repo and installing the dependencies I get the following runtime error when using npm start:
/YORB2020-master/node_modules/twilio/lib/rest/Twilio.js:126
throw new Error('username is required');
^
Error: username is required
at new Twilio (/YORB2020-master/node_modules/twilio/lib/rest/Twilio.js:126:11)
at initializer (/YORB2020-master/node_modules/twilio/lib/index.js:9:10)
at Object.<anonymous> (/YORB2020-master/server.js:57:39)
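Not part of the original repo, but a quick way to sidestep this: guard the Twilio initialization so the server can start without credentials. The identifier names below are assumptions for illustration, not the actual ones in server.js:

```javascript
// Hypothetical guard (names assumed, not taken from server.js): only build
// the Twilio client when credentials are configured, otherwise return null
// so the caller can fall back to public STUN servers instead of Twilio TURN.
function maybeCreateTwilioClient(env, twilioFactory) {
  if (env.TWILIO_ACCOUNT_SID && env.TWILIO_AUTH_TOKEN) {
    return twilioFactory(env.TWILIO_ACCOUNT_SID, env.TWILIO_AUTH_TOKEN);
  }
  return null; // no credentials: skip Twilio entirely
}
```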
Okay. Tested the following with Firefox (75.0) and heard the incoming WebRTC stream:
let audioStream = new MediaStream([_remoteStream.getAudioTracks()[0]]);
let audioSource = new THREE.Audio(glScene.listener);
audioSource.setMediaStreamSource(audioStream);
then tested with PositionalAudio and also heard the incoming stream, but without positional effects. (EDIT: PositionalAudio is working in Firefox using the code below.) I played with the RolloffFactor, RefDistance, and DistanceModel as well, but to no avail. Ideally, I'd like to hear the sound when within 5 units or so, then not hear it at all. (I am assuming distance units match three.js world units?)
let audioSource = new THREE.PositionalAudio(glScene.listener);
audioSource.setMediaStreamSource(audioStream);
audioSource.setRefDistance(20);
glScene.scene.add(audioSource);
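On the hard cutoff: as far as I know, only the 'linear' distance model actually reaches zero gain (at maxDistance); the default 'inverse' model only approaches zero asymptotically, so some sound always remains. A sketch (values illustrative, not from the original code), with the linear model's gain formula alongside for reference:

```javascript
// Sketch: make a PositionalAudio fall completely silent at `cutoff` world
// units by switching to the Web Audio 'linear' distance model.
function configureHardCutoff(positionalAudio, cutoff) {
  positionalAudio.setDistanceModel('linear');
  positionalAudio.setRefDistance(1);     // full volume within 1 unit
  positionalAudio.setRolloffFactor(1);
  positionalAudio.setMaxDistance(cutoff); // silent at and beyond `cutoff`
  return positionalAudio;
}

// The linear model's gain, with distance clamped to [refDistance, maxDistance]:
function linearGain(distance, refDistance, maxDistance, rolloff) {
  const d = Math.max(refDistance, Math.min(distance, maxDistance));
  return 1 - rolloff * (d - refDistance) / (maxDistance - refDistance);
}
```

With rolloffFactor 1, `linearGain` goes from 1 at refDistance down to exactly 0 at maxDistance.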
I’m wondering whether it is okay that my AudioListener is parented to a Group rather than an Object3D?
Okay. See above edit (the above code works in Firefox 75.0) and PositionalAudio works as well. Should I submit an issue for Chrome? Or is there more I can do to debug on my end?
Hi @Mugen87, just wanted to follow up on this thread, as the Chromium bug thread has now referenced it. A user there has mentioned a possible solution:
Create an HTML5 audio element,
Set the audio element source to the incoming MediaStream,
Use that audio element as a source for the webAudio positional audio object.
I am trying to implement this, but am running into the issue of both audio sources (the HTML5 audio element and the WebAudio / three.js positionalAudio) playing at once (and therefore losing the stereo imaging / positionality of the positionalAudio). Do you have any recommendations for this?
Ahh, wonderful! Well, maybe you could tell me what I was doing wrong here? I am using Mediasoup as a selective forwarding unit server, so consumer refers to the incoming media source.
// Create HTML audio element
var el = document.createElement('audio');
document.body.appendChild(el);
el.controls = true;
// when this is active, audio plays globally
// el.setAttribute('autoplay', true);
// set html element source object to incoming stream:
el.srcObject = new MediaStream([consumer.track.clone()]);
el.consumer = consumer;
// THREE.js PositionalAudio
let audioSource = new THREE.PositionalAudio(this.listener);
audioSource.setRefDistance(10);
audioSource.setRolloffFactor(10);
// add to our 'clients' object
clients[_id].group.add(audioSource);
clients[_id].positionalAudioSource = audioSource;
// set audiosource
audioSource.setMediaElementSource(el);
I was mainly using those two lines to test different scenarios. Here is my current issue:
When the HTML audio element is playing:
In Firefox (75.0), I can set the HTML audio element volume to 0 and can still hear the positionalAudio.
In Chrome, when I set the HTML audio element volume to 0, I cannot hear the positionalAudio. (And when the element's volume is up, I can't tell whether I am hearing the positionalAudio at all underneath the global, non-positional audio.)
When the HTML audio is not playing:
In Firefox and Chrome, I don’t hear the PositionalAudio object.
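For what it's worth, the variant I've seen suggested for this Chrome bug (hedged — I have not verified it in this thread) avoids setMediaElementSource entirely: the muted element exists only to make Chrome start pulling audio from the remote track, and the audible path remains setMediaStreamSource. A sketch, reusing the names from the earlier snippets:

```javascript
// Hedged sketch of a commonly suggested Chrome workaround (not verified
// here): a hidden, muted element nudges Chrome into delivering the remote
// track's audio; the WebAudio graph is the only audible output path.
function attachRemoteStream(stream, listener) {
  const el = document.createElement('audio');
  el.srcObject = stream; // makes Chrome pull audio from the remote track
  el.muted = true;       // the element itself never produces sound
  // note: no el.play() and no setMediaElementSource here

  const positional = new THREE.PositionalAudio(listener);
  positional.setMediaStreamSource(stream); // audible, positional path
  return positional;
}
```

This way there is a single audible source, so the stereo imaging / positionality should survive.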
Could you share your workaround test code? I’m wondering if I am missing something simple…