Recently I investigated whether I could implement software streaming of mjpeg video on a camera that uses web2py and Python as its HTTP interface. Web2py makes most common operations quite straightforward, and streaming is no exception, although I had to go digging for some details; this post, along with SergeyPo's answer to this question, was of immense help.
mjpeg, or Motion JPEG, streaming over HTTP is a popular way of streaming video from IP cameras. If your camera has already gone to the effort of jpeg'ing its frames, then it is reasonably cheap to stream them out via software over HTTP as an mjpeg video stream (as opposed to encoding to h264 etc.).
Mjpeg can be viewed natively by most browsers (except Internet Explorer) and can also be viewed in video players like VLC. An mjpeg stream is quite simple: it really just contains the individual jpeg frames' data separated by a defined frame boundary marker.
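To make the wire format concrete, here is a rough sketch (not from any library, and the boundary name is arbitrary) of how a single jpeg frame gets wrapped up as one part of the stream:

```python
# Sketch: wrap one jpeg frame's bytes as a single mjpeg stream part.
# "frame" is just an example boundary marker, nothing standard.
def mjpeg_part(jpeg_bytes, boundary=b"frame"):
    return (b"--" + boundary + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode("ascii") +
            b"\r\n\r\n" + jpeg_bytes + b"\r\n")

# A stream body is then just mjpeg_part(frame1) + mjpeg_part(frame2) + ...
part = mjpeg_part(b"\xff\xd8" + b"\x00" * 10 + b"\xff\xd9")
```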
Web2py takes care of chunked streaming via the stream() function; we just have to write a file-like object that we pass to stream(). This object takes care of loading the jpeg frames and providing the data back to stream(), which in turn streams it down the line to the client.
So to create an mjpeg stream we:
1.) Add a web2py handler for the stream, for example /mjpeg
2.) Set the Content-Type response header to: “multipart/x-mixed-replace; boundary=the_answer_is_42”
“the_answer_is_42” is just a made-up boundary marker which we hope won’t appear in the jpeg frame data; it can be changed to something else.
3.) Call stream(), passing an instance of our MJPEGStreamer file-like object (see #4!) and a fairly arbitrary chunk size.
4.) Define a file-compatible object that will loop ‘forever’ and provide the data to stream(). It provides the data for the individual jpeg frames, inserts the frame boundaries (--the_answer_is_42) between frames, and also inserts the headers for the individual frames (Content-Type: image/jpeg).
The following provides an overview of a stream implementation. Here we imagine that the latest jpeg frame is always to be found in a ram-disk file called image.jpg; we load this data so that we can stream it to the client.
So in our web2py controller we have something like:
# Proof of concept web2py mjpeg streamer
def mjpeg():
    # Set the initial response header
    response.headers['Content-Type'] = 'multipart/x-mixed-replace; boundary=the_answer_is_42'
    # Stream the mjpeg data (via MJPEGStreamer) with a chunk size of
    # 30000; the chunk size must be less than the size of the smallest
    # jpeg frame or the stream will stall....
    return response.stream(MJPEGStreamer(), 30000)
And the MJPEGStreamer file-like object itself looks something like this:
import time

class MJPEGStreamer(object):
    def __init__(self):
        self.iterator = self.make_iterator()

    def make_iterator(self):
        # Loop 'forever', yielding one complete mjpeg part per frame
        while True:
            out = b''
            # Read a jpeg frame from image.jpg
            data = open('/media/ram/image.jpg', 'rb').read()
            # Add the frame boundary to the output
            out += b"--the_answer_is_42\r\n"
            # Add the jpg frame header
            out += b"Content-Type: image/jpeg\r\n"
            # Add the frame content length
            out += b"Content-Length: " + str(len(data)).encode('ascii') + b"\r\n\r\n"
            # Add the actual binary jpeg frame data
            out += data + b"\r\n"
            # Sleep for a bit between frames..
            time.sleep(0.05)
            yield out

    # stream() calls read() to get data.
    def read(self, n):
        # Get some more stream data
        return next(self.iterator)
To view the mjpeg stream, open VLC, choose Media / Open Network Stream, provide the stream URL (e.g. http://my_streaming_host/mjpeg), and hit play…
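As a sanity check outside VLC, the stream can also be split back into frames by hand. Here is a rough parser sketch (my own, not part of web2py) that assumes the boundary used above and that the boundary bytes never occur inside a frame:

```python
# Sketch: split a buffered chunk of an mjpeg stream back into jpeg frames.
# Assumes the "--the_answer_is_42" boundary from the examples above and
# that it never appears inside the jpeg data itself.
def split_frames(buf, boundary=b"--the_answer_is_42"):
    frames = []
    for part in buf.split(boundary):
        # Each non-empty part looks like: headers + b"\r\n\r\n" + jpeg bytes
        head, sep, body = part.partition(b"\r\n\r\n")
        if sep:
            frames.append(body.rstrip(b"\r\n"))
    return frames
```

A viewer does essentially this incrementally as bytes arrive, rendering each recovered frame in place of the previous one.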
I found that it was important to set the chunk size to be (a good bit?) less than the data size of the smallest frame to be sent, or else the stream would stall; presumably the streamer treats a read() that returns fewer bytes than the requested chunk size as the end of the stream…
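I haven't confirmed web2py's streamer internals, but the stall is consistent with a chunked loop that stops on a short read. A toy model under that assumption (all names here are hypothetical) shows the failure mode:

```python
# Toy model of a chunked streamer that (by assumption) treats a read()
# returning fewer bytes than requested as end-of-stream.
def stream_chunks(read, chunk_size):
    sent = []
    while True:
        chunk = read(chunk_size)
        sent.append(chunk)
        if len(chunk) < chunk_size:  # short read -> assume EOF, stop
            break
    return sent

# Fake frame source: read() returns one whole frame per call, like
# MJPEGStreamer.read() above, then b"" once exhausted.
frames = iter([b"x" * 500, b"x" * 200, b"x" * 500])
read = lambda n: next(frames, b"")

# With chunk_size 300 > the smallest frame (200 bytes), the loop stops
# after the second frame and the third is never sent -- the "stall".
sent = stream_chunks(read, 300)
```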