It receives a live stream via WebSocket. Any other transport can be plugged in, as long as it follows the same interface as the built-in WebSocketClient.
This WebSocketClient reconnects automatically every 5 seconds.

TODO: add a logarithmic/Fibonacci/exponential back-off strategy
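As a rough sketch of what that back-off could look like (`ReconnectingClient` and its methods are illustrative names, not the actual WebSocketClient API, which currently uses a fixed 5-second delay):

```js
// Illustrative exponential back-off for the reconnect logic.
function ReconnectingClient(url) {
  this.url = url;
  this.attempts = 0;
  this.connect();
}

ReconnectingClient.prototype.connect = function () {
  var self = this;
  var ws = new WebSocket(this.url);
  ws.onopen = function () {
    self.attempts = 0; // reset the back-off once connected
  };
  ws.onclose = function () {
    // Double the delay on every failed attempt, capped at 30 seconds.
    var delay = Math.min(1000 * Math.pow(2, self.attempts), 30000);
    self.attempts++;
    setTimeout(function () { self.connect(); }, delay);
  };
};
```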
Run `npm install`, then `npm start` to start the HTTP and WebSocket servers.
Then, in another terminal, start streaming. You can use a camera that provides an RTSP feed, or your own laptop's webcam:
```sh
# Laptop webcam feed
./start_ffmpeg_stream.sh /dev/video0

# Camera feed
./start_ffmpeg_stream.sh "rtsp://192.168.1.54:554/axis-media/media.amp?videocodec=h264&resolution=640x480"
```
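For context, a server like this typically accepts the MPEG stream from ffmpeg over HTTP and fans each chunk out to every connected WebSocket client. The sketch below is an assumed, minimal version of that relay pattern, not the server shipped in this repo; the HTTP port 8082 is made up, while 8084 matches the WebSocket address used in the example further down.

```js
// Minimal sketch of an HTTP-to-WebSocket relay (assumed architecture).
var http = require('http');
var WebSocket = require('ws');

var wss = new WebSocket.Server({ port: 8084 });

http.createServer(function (req, res) {
  // ffmpeg POSTs the raw MPEG stream here; every chunk is
  // forwarded to all connected browser players.
  req.on('data', function (chunk) {
    wss.clients.forEach(function (client) {
      if (client.readyState === WebSocket.OPEN) {
        client.send(chunk);
      }
    });
  });
  req.on('end', function () { res.end(); });
}).listen(8082);
```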
To build, just run
```js
var player = new jsmpeglive(uri [, options]);
```

The `uri` argument accepts a WebSocket address for streaming playback. The `options` argument supports the following properties:
- `benchmark`: whether to log benchmark results to the browser's console
- `canvas`: the HTML Canvas element to use; jsmpeglive will create its own Canvas element if none is provided
- `ondecodeframe`: a function that's called after every frame that is decoded and rendered to the canvas
The best example is the source code itself; feel free to review stream-example.html:
```js
// Assumes an existing <canvas> element on the page
var canvas = document.querySelector('canvas');
var player = new jsmpeglive('ws://localhost:8084/', { canvas: canvas });
```
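A fuller call using all of the options listed above might look like this (the logging in the callback is just for illustration, and whether `ondecodeframe` receives any arguments is not documented here):

```js
var player = new jsmpeglive('ws://localhost:8084/', {
  canvas: document.querySelector('canvas'),
  benchmark: true, // log benchmark results to the console
  ondecodeframe: function () {
    // called after every frame is decoded and rendered
    console.log('frame rendered');
  }
});
```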
Compared to upstream jsmpeg, the transport layer was removed from the main decoding object. Recording was removed as well, since I didn't need it, along with all the functions for playing local video files.
- Only raw MPEG video streams are supported; the decoder cannot handle Stream Packet Headers in between macroblocks.
It is based on Dominic Szablewski's jsmpeg.