Send MediaStream object with Web Audio effects over PeerConnection

I am trying to send audio that was captured with getUserMedia() and modified with the Web Audio API over a WebRTC PeerConnection. The Web Audio API and WebRTC seem to support this, but I am having trouble figuring out how to do it. In the Web Audio API, the AudioContext object provides a method createMediaStreamSource(), which is a way to connect to the MediaStream obtained from getUserMedia(). There is also a method createMediaStreamDestination(), which appears to return an object with a stream attribute.

I am getting both audio and video from getUserMedia(). How do I pass this stream object (with audio and video) to these methods (e.g. createMediaStreamSource())? Do I need to first extract the audio from the stream (getAudioTracks) and then find a way to combine it back with the video? Or do I pass the stream as is and the video is left unaffected? And can the audio only be changed once (before adding it to the PeerConnection)?

2 answers


The createMediaStreamSource() method takes a MediaStream object as its parameter and uses the first audio MediaStreamTrack from that object as the sound source. It can be used with the MediaStream object obtained from getUserMedia(), even if that object contains both audio and video. For example:

var source = context.createMediaStreamSource(localStream);

Where "context" in the above code is an object AudioContext

and "localStream" is a MediaStream object obtained from getUserMedia (). The method createMediaStreamDestination()

creates a target node that has a MediaStream object in its "stream" attribute. This MediaStream object contains only one AudioMediaStreamTrack (even if the input stream to the source contained audio and video or multiple audio tracks): a modified version of the track obtained from the stream in the source. For example:

var destination = context.createMediaStreamDestination();

Now, before you can access the stream attribute of the destination variable you just created, you must build the audio graph by connecting all of the nodes together. For this example, assume we have a BiquadFilterNode named filter:

source.connect(filter);
filter.connect(destination);

Then we can take the stream attribute from the destination variable and add it to the PeerConnection to send to the remote peer:



peerConnection.addStream(destination.stream);

Note: the stream attribute contains a MediaStream object with only the modified audio MediaStreamTrack, so there is no video. If you want video to be sent as well, you have to add this track to a stream object that also contains the video track:

var audioTracks = destination.stream.getAudioTracks();
var track = audioTracks[0]; // the destination stream only contains one audio track
localStream.addTrack(track);
peerConnection.addStream(localStream);

Keep in mind that addTrack will not add the track if the MediaStream already contains a track with the same id, so you may have to first remove the original, unprocessed audio track that you fed into the source node.
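A minimal, untested sketch of how that might look, assuming the unprocessed audio track should be removed from localStream before the processed one is added (the variable names here are only for illustration):

// Drop the original, unprocessed audio track from the local stream.
var originalAudioTrack = localStream.getAudioTracks()[0];
localStream.removeTrack(originalAudioTrack);

// Add the Web Audio-processed track in its place.
var processedTrack = destination.stream.getAudioTracks()[0];
localStream.addTrack(processedTrack);

// Send the combined stream (original video + processed audio) to the peer.
peerConnection.addStream(localStream);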

The sound can be changed at any time by adjusting the values of the intermediate nodes (between the source and the destination), because the stream passes through those nodes before being sent to the other peer. Check out this example of dynamically changing the effect on recorded audio (the same should apply to a stream). Note: I have not tested this code yet. While it should work in theory, there may be cross-browser issues, since the Web Audio API and WebRTC are still working drafts and not yet standardized. I assume it will work in Mozilla Firefox and Google Chrome.
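As a rough sketch of that idea, assuming the BiquadFilterNode named filter from the example above, you could adjust the effect on the live stream like this:

filter.type = "lowpass";       // configure the filter when building the graph
filter.frequency.value = 1000; // initial cutoff frequency in Hz

// Later (e.g. from a UI control), change the effect while the call is running;
// the audio sent over the PeerConnection picks up the new value immediately.
function setCutoff(hz) {
  filter.frequency.value = hz;
}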




@Android Student's answer is good according to the current specs; however, there are problems with how Firefox and Chrome implement them.

Last I checked, Chrome was unable to process WebRTC output through Web Audio, while Firefox can.

On the other hand, two bugs currently block Firefox from using source streams generated by Web Audio in a PeerConnection; one is now fixed in Nightly and Aurora, and the other should be fixed soon. Firefox also does not yet implement stream.addTrack, which is another complication. Chrome does appear to be able to handle streams generated by Web Audio in a PeerConnection.
