Pre-recorded media streaming
© 2013 Muaz Khan <@muazkh> » @WebRTC Experiments
How to stream your own video?
<script src="https://www.webrtc-experiment.com/streamer.js"></script>
<script>
    var streamer = new Streamer();
</script>
pre-recorded media sender instance
<script>
    // streamer.js calls "push" for each chunk; forward it over your transport
    streamer.push = function (chunk) {
        socket.send(chunk);
    };

    // start streaming the selected pre-recorded WebM file
    document.querySelector('input[type=file]').onchange = function () {
        streamer.stream(this.files[0]);
    };
</script>
pre-recorded media receiver instance
<script>
    streamer.video = document.querySelector('video');
    streamer.receive();

    // called for each incoming chunk
    function onData(data) {
        if (data.end) streamer.end();   // last chunk: finish the stream
        else streamer.append(data);     // otherwise append it to the video
    }
</script>
socket.io/websocket to push chunks
<script>
    socket.onmessage = onData; // or socket.on('message', onData);
</script>
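The snippets above assume a "socket" object already exists. A minimal sketch of creating one (the URL is just a placeholder; socket.io or Firebase can be plugged in the same way):

<script>
    // plain WebSocket transport; replace the URL with your own server
    var socket = new WebSocket('wss://example.com/streamer');
    socket.binaryType = 'arraybuffer'; // chunks arrive as ArrayBuffers

    socket.onopen = function () {
        console.log('transport ready; you can start streaming now');
    };
</script>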
Chrome M28 or newer (or Firefox with the "media.mediasource.enabled" preference set to true) is required to test this experiment.
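Before starting, you can feature-detect MediaSource; a small sketch (older Chrome builds exposed the prefixed WebKitMediaSource name):

<script>
    var MediaSourceConstructor = window.MediaSource || window.WebKitMediaSource;
    if (!MediaSourceConstructor) {
        alert('MediaSource API is not supported in this browser.');
    }
</script>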
- Streaming pre-recorded video (media file)
- Currently, Firebase is used to stream the chunks of data because MediaSource APIs are only supported on Chrome Canary, whose RTP-based RTCDataChannel streams are unreliable (see the Firebase sketch after this list). Note: SCTP data channels landed in Chrome M30+ behind a flag.
- Streaming WebM files only (at the moment!)
- The WebM file's size must be less than 1000KB; otherwise streaming will fail. It is a bug that will be fixed soon.
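As a rough illustration of the Firebase transport mentioned above, here is a sketch assuming the classic Firebase JavaScript client and a placeholder firebaseio.com URL; chunks are base64-encoded because Firebase only stores JSON-friendly values:

<script>
    var firebase = new Firebase('https://your-app.firebaseio.com/streamer'); // placeholder URL

    // sender: base64-encode each Uint8Array chunk before pushing it
    streamer.push = function (chunk) {
        if (chunk.end) return firebase.push({ end: true });
        firebase.push({ data: btoa(String.fromCharCode.apply(null, chunk)) });
    };

    // receiver: decode every child back into a Uint8Array and append it
    firebase.on('child_added', function (snapshot) {
        var value = snapshot.val();
        if (value.end) return streamer.end();

        var raw = atob(value.data);
        var chunk = new Uint8Array(raw.length);
        for (var i = 0; i < raw.length; i++) chunk[i] = raw.charCodeAt(i);
        streamer.append(chunk);
    });
</script>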
It is an early release!
- This experiment is an early release. In the future, RTCDataChannel APIs will be used to stream pre-recorded media in realtime, until "video.captureStream" gets implemented. (Note: Firefox implemented a prefixed version of the captureStream API, i.e. mozCaptureStreamUntilEnded; see the sketch after this list.)
- MediaSource APIs are not designed for streaming pre-recorded media, though!
- Waiting for the "video.captureStream" implementation, which is proposed for pre-recorded media streaming and unfortunately still a draft!
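For the prefixed Firefox API mentioned above, a minimal Firefox-only sketch ("peer" stands for an existing RTCPeerConnection):

<script>
    var video = document.querySelector('video');
    video.src = 'pre-recorded.webm';
    video.play();

    if (video.mozCaptureStreamUntilEnded) {
        // capture a MediaStream that ends when playback ends
        var preRecordedStream = video.mozCaptureStreamUntilEnded();
        peer.addStream(preRecordedStream);
    }
</script>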
In the future:
partial interface HTMLMediaElement {
    readonly attribute MediaStream stream;
    MediaStream captureStream();
    MediaStream captureStreamUntilEnded();
    readonly attribute boolean audioCaptured;
    attribute any src;
};
We will be able to get a stream from a video element like this:
video.src = 'your pre-recorded webm/etc. video';
var preRecordedStream = video.captureStream();
peer.addStream(preRecordedStream);
Other possibilities
// createMediaElementSource is a method of the Web Audio API's AudioContext
var context = new (window.AudioContext || window.webkitAudioContext)();
var videoSource = context.createMediaElementSource(video);
var audioSource = context.createMediaElementSource(audio);
"createMediaElementSource" is also landed on chrome M >= 30. It is also useful for streaming pre-recorded medias.
How does this experiment work?
- Getting access to the WebM video file using the File API
- Reading it as an ArrayBuffer using the FileReader API
- Splitting the buffer into small, predefined chunks and posting/transmitting those chunks in a loop using Firebase
- As soon as the other party receives the first chunk, the MediaSource API starts playing the video without waiting for all chunks to be downloaded!
- You can save/store/record those chunks in any database, because each chunk is a typed array (Uint8Array). A condensed sketch of these steps follows this list.
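A condensed sketch of those steps; the chunk size, the transport callback, and the 'vp8' codec string are assumptions, and streamer.js hides the same logic behind stream(), append() and end():

<script>
    // sender: read the WebM file and transmit it in small chunks
    var CHUNK_SIZE = 16 * 1024; // bytes per chunk (arbitrary)

    function streamFile(file, sendChunk) {
        var reader = new FileReader();
        reader.onload = function () {
            var bytes = new Uint8Array(reader.result);
            for (var offset = 0; offset < bytes.length; offset += CHUNK_SIZE) {
                sendChunk(bytes.subarray(offset, offset + CHUNK_SIZE));
            }
            sendChunk({ end: true }); // tell the receiver we're done
        };
        reader.readAsArrayBuffer(file); // File API + FileReader API
    }

    // receiver: feed incoming chunks to MediaSource; playback starts
    // as soon as the first chunks have been appended
    var mediaSource = new MediaSource();
    var video = document.querySelector('video');
    video.src = URL.createObjectURL(mediaSource);

    var sourceBuffer;
    mediaSource.addEventListener('sourceopen', function () {
        sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    });

    function onChunk(chunk) {
        if (chunk.end) mediaSource.endOfStream();
        else sourceBuffer.appendBuffer(chunk); // you can also store the Uint8Array in a database here
    }
</script>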
Let's say you want to:
- Stream minutes 5 to 7 of video data, i.e. two minutes in total (over all sockets), from the first WebM file.
- Then quickly stream minutes 17 to 19, i.e. two minutes of data, from the second WebM file.
- Then stream minutes 11 to 15, i.e. four minutes of data, from the third WebM file.
You can do all such things today!
In simple words: you can stream part of the video from the first WebM file, part from the second WebM file, and so on, in realtime!
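Purely as an illustration of that idea, here is a hypothetical sketch: timeRangeToByteRange() is an assumed helper (it would have to parse the WebM cues to map seconds to byte offsets), firstWebm/secondWebm/thirdWebm are File objects, streamFile comes from the sketch above, and sendChunk stands for whatever transport callback you use:

<script>
    var playlist = [
        { file: firstWebm,  from: 5 * 60,  to: 7 * 60 },  // minutes 5 to 7
        { file: secondWebm, from: 17 * 60, to: 19 * 60 }, // minutes 17 to 19
        { file: thirdWebm,  from: 11 * 60, to: 15 * 60 }  // minutes 11 to 15
    ];

    var index = 0;
    (function streamNext() {
        if (index >= playlist.length) return;
        var item = playlist[index++];

        // hypothetical helper: map seconds to byte offsets using the WebM cues
        var range = timeRangeToByteRange(item.file, item.from, item.to);

        streamFile(item.file.slice(range.start, range.end), function (chunk) {
            sendChunk(chunk);            // transmit over your transport
            if (chunk.end) streamNext(); // then continue with the next range
        });
    })();
</script>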
Suggestions
- If you're a newcomer or beginner, you're advised to try the RTCMultiConnection.js or DataChannel.js libraries instead.