These are chat archives for andyet/SimpleWebRTC

Feb 2nd, 2016
Juan Cristóbal Olivares
@juancri
Feb 02 2016 02:56
on ubuntu, just run: apt-get install coturn
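For context, here is a minimal sketch of how a running coturn instance gets wired into the browser side via an RTCPeerConnection config; the stun:/turn: URLs, username, and credential are placeholders, not values from this chat:

```js
// Minimal sketch: pointing a plain RTCPeerConnection at a coturn instance.
// The URLs, username, and credential below are placeholders; substitute the
// values from your own turnserver.conf.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },
    {
      urls: 'turn:turn.example.com:3478',
      username: 'demo-user',
      credential: 'demo-secret'
    }
  ]
});
```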
Vinay
@vhmth
Feb 02 2016 03:11
Hey guys, I have a question - it's a lot of info.

So I’m building a Chrome Extension that captures the tab content (what’s going on on a website) as a MediaStream. I want to send that stream to my server (or preferably to S3 directly from the browser). I noticed that Chrome doesn’t have a proper way to record streams yet (it’s coming via the MediaRecorder API, which FF already has).
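A rough sketch of that capture step, assuming Chrome's chrome.tabCapture extension API (it needs the "tabCapture" permission and has to be triggered from the extension itself, e.g. a browser-action click):

```js
// Hedged sketch of grabbing the current tab as a MediaStream from an
// extension. Requires the "tabCapture" permission in the manifest.
chrome.tabCapture.capture({ audio: true, video: true }, function (stream) {
  if (!stream) {
    console.error('tabCapture failed:', chrome.runtime.lastError);
    return;
  }
  // Sanity check: attach the raw stream to a <video> tag, as mentioned above.
  const video = document.createElement('video');
  video.srcObject = stream; // older Chrome builds needed URL.createObjectURL(stream)
  video.play();
});
```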

My idea was to set up two RTCPeerConnections and then listen to data events on them to get at the stream bytes, then send those to S3. If that isn’t possible, I was going to emulate the RTCPeerConnections and set up a socket to my server, which would then handle the bytes sent over it.

Am I on the right track here? I’m not sure. I tried using RecordRTC, but it results in shitty video quality (much shittier than the raw stream - I know because I’ve attached the raw MediaStream to a <video> tag). I’m guessing that’s because, since Chrome doesn’t have a proper MediaRecorder yet, it renders frames to canvas elements as fast as it can, takes images of those elements, and strings them together into a webm file using something called WhammyRecorder.
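For reference, the canvas fallback that recorders like RecordRTC rely on looks roughly like the sketch below (illustrative only, not RecordRTC's actual internals); re-encoding lossy snapshots of a canvas on a timer is where the quality drop comes from:

```js
// Rough illustration of the canvas-snapshot approach: draw the playing
// <video> into a canvas on an interval and keep the resulting frames,
// which a webm muxer like Whammy later stitches together.
function captureFrames(videoEl, fps) {
  const canvas = document.createElement('canvas');
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  const ctx = canvas.getContext('2d');
  const frames = [];

  const timer = setInterval(function () {
    ctx.drawImage(videoEl, 0, 0, canvas.width, canvas.height);
    frames.push(canvas.toDataURL('image/webp')); // lossy snapshot per frame
  }, 1000 / fps);

  return {
    stop: function () {
      clearInterval(timer);
      return frames; // handed to the muxer
    }
  };
}
```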

Vinay
@vhmth
Feb 02 2016 03:16
For those who don't want to read that, TL;DR: I want to be able to get at the bytes of a video MediaStream in Chrome and send those to S3. I'm fine with a solution that involves running a server (but I'd prefer getting the bytes directly in the browser).
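Once MediaRecorder is available (it already is in Firefox), getting at the bytes of a MediaStream looks roughly like this sketch; the /presign endpoint and the presigned S3 PUT URL it returns are assumptions, not something from this chat:

```js
// Sketch of the MediaRecorder path: collect webm chunks from the stream,
// then upload the result. The presign endpoint is hypothetical.
function recordAndUpload(stream, durationMs) {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
  const chunks = [];

  recorder.ondataavailable = function (e) {
    if (e.data && e.data.size > 0) chunks.push(e.data);
  };

  recorder.onstop = function () {
    const blob = new Blob(chunks, { type: 'video/webm' });
    // Hypothetical endpoint that returns a presigned S3 PUT URL.
    fetch('/presign?name=capture.webm')
      .then(function (res) { return res.text(); })
      .then(function (url) {
        return fetch(url, { method: 'PUT', body: blob });
      });
  };

  recorder.start();
  setTimeout(function () { recorder.stop(); }, durationMs);
}
```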
Vinay
@vhmth
Feb 02 2016 05:36
Ok, I think I've figured out that I'll just have to establish an RTCPeerConnection with a server in some way and then grab the data from there. Does anyone have any suggestions for node libs that will let me do this? It seems like a lot of them are either abandoned or not fully implemented and don't support MediaStreams.
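For what it's worth, the browser-side half of that idea only needs the standard RTCPeerConnection API; a sketch, assuming a WebSocket for signaling and a server that can answer the offer and consume the incoming media (the node lib on the other end is the open question):

```js
// Browser side of "push a MediaStream to a server over an RTCPeerConnection".
// The signaling transport (a WebSocket here) and the answering server are
// assumptions; whatever node lib sits on the other end must speak SDP/ICE.
function pushStreamToServer(stream, signalingUrl) {
  const ws = new WebSocket(signalingUrl);
  const pc = new RTCPeerConnection({ iceServers: [] });

  // Attach the captured stream (older Chrome builds used pc.addStream(stream)).
  stream.getTracks().forEach(function (track) {
    pc.addTrack(track, stream);
  });

  pc.onicecandidate = function (e) {
    if (e.candidate) ws.send(JSON.stringify({ candidate: e.candidate }));
  };

  ws.onopen = function () {
    pc.createOffer()
      .then(function (offer) { return pc.setLocalDescription(offer); })
      .then(function () {
        ws.send(JSON.stringify({ sdp: pc.localDescription }));
      });
  };

  ws.onmessage = function (msg) {
    const data = JSON.parse(msg.data);
    if (data.sdp) pc.setRemoteDescription(new RTCSessionDescription(data.sdp));
    if (data.candidate) pc.addIceCandidate(new RTCIceCandidate(data.candidate));
  };

  return pc;
}
```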