These are chat archives for andyet/SimpleWebRTC
So I’m building a Chrome Extension that captures the tab content (what’s going on on a website) as a MediaStream. I want to send that stream to my server (or preferably to S3 directly from the browser). I noticed that Chrome doesn’t have a proper way to record streams yet (it’s coming as something called MediaRecorder, which FF currently has).
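Here’s roughly what the capture side looks like, as a sketch: it assumes the extension’s manifest has the `tabCapture` permission, and `onStream` is just my own callback name.

```javascript
// Hypothetical sketch, assuming the "tabCapture" permission is declared in the
// extension manifest; onStream is a callback name of my own invention.
function startTabCapture(onStream) {
  chrome.tabCapture.capture({ audio: true, video: true }, (stream) => {
    if (!stream) {
      // capture can fail, e.g. on chrome:// pages or without permission
      console.error('tab capture failed:', chrome.runtime.lastError);
      return;
    }
    onStream(stream); // a MediaStream of the current tab's audio/video
  });
}
```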
My idea was to set up two RTCPeerConnections and listen to data events on one of them to get at the stream bytes, then send those to S3. If that isn’t possible, I was going to emulate the RTCPeerConnections and set up a socket to my server, which would then handle the bytes mediated over it.
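The socket fallback I have in mind looks something like this sketch. It assumes MediaRecorder is available (so FF-only for now) and a WebSocket endpoint on my server; `serverUrl` is a placeholder, not a real endpoint.

```javascript
// Sketch of the socket fallback, assuming MediaRecorder support (currently
// Firefox-only) and a WebSocket endpoint on my server; serverUrl is a placeholder.
function streamToServer(mediaStream, serverUrl) {
  const ws = new WebSocket(serverUrl);
  const recorder = new MediaRecorder(mediaStream, { mimeType: 'video/webm' });
  recorder.ondataavailable = (e) => {
    if (e.data && e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(e.data); // each chunk is a piece of the webm container
    }
  };
  ws.onopen = () => recorder.start(1000); // hand me a chunk every second
  return recorder;
}
```

The server would just append the chunks in order and push the result to S3.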
Am I on the right track here? I’m not sure. I tried using RecordRTC, but it results in shitty video quality (much shittier than the raw stream - I know because I’ve attached the raw MediaStream to a <video> tag). I’m guessing that’s because, since Chrome doesn’t have a proper MediaRecorder yet, it renders frames to a canvas element as fast as it can, takes images of that element, and strings them together into a webm file using something called WhammyRecorder.
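This is my understanding of that Whammy-style fallback as a sketch (not RecordRTC’s actual code): paint the `<video>` to a canvas on a timer and collect lossy webp frames, which would explain the quality loss.

```javascript
// My understanding of the Whammy-style fallback (a sketch, not RecordRTC's
// actual implementation): repeatedly paint the <video> element to a canvas
// and collect lossy webp snapshots - the per-frame re-encoding is likely
// where the quality drop comes from.
function captureFrames(videoEl, fps) {
  const canvas = document.createElement('canvas');
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  const ctx = canvas.getContext('2d');
  const frames = [];
  const timer = setInterval(() => {
    ctx.drawImage(videoEl, 0, 0, canvas.width, canvas.height);
    frames.push(canvas.toDataURL('image/webp', 0.8)); // lossy snapshot per frame
  }, 1000 / fps);
  return { frames, stop: () => clearInterval(timer) };
}
```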
So: what’s the best way to record a MediaStream in Chrome and send those bytes to S3? I’m fine with a solution that involves running a server (but I’d prefer getting the bytes directly in the browser).