Encode and stream continuously generated PNG image files as live video to a web browser

I have an OpenGL application that displays a simulation animation, outputs multiple PNG image files per second, and saves those files to disk. I want to stream these image files as live video over HTTP so that I can watch the animation from a web browser. I already have a reliable socket server that handles the WebSocket connection, and I can handle all of the handshake and message encoding/decoding myself. My server program and OpenGL application are written in C++.

A few questions:

  • What is the best way to pipe this OpenGL animation output to my web browser and view it there? Video frames are dynamically (continuously) generated by the OpenGL application as PNG image files. The web browser should display video that matches the OpenGL output (with minimal latency).

  • How can I encode these PNG image files into a continuous (live) video programmatically in C/C++ (without having to manually drag and drop the image files into streaming server software like Flash Media Live Encoder)? What video format should I produce?

  • Should I send/receive the animation data over WebSocket, or is there a better way (jQuery Ajax, for example; I'm only guessing here, so please point me toward the correct approach)? It would be great if this video streaming worked across different browsers.

  • Does the HTML5 video tag support live video streaming, or does it only work with a complete video file that exists at a specific URL (not live)?

  • Are there any existing code samples (tutorials) for this kind of live streaming, where a C/C++/Java application generates image frames and a web browser consumes that output as a video stream? I could barely find tutorials on this topic after spending a few hours searching on Google.



2 answers


You definitely want to stop writing PNG files to disk and instead feed the image frames directly into a video encoder. A good bet is to use libav/ffmpeg. You will then need to encapsulate the encoded video in a network-friendly format. I would recommend x264 as the encoder and an MPEG-4 or MPEG-2 TS container for the stream.
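A minimal sketch of that encoder setup using the libavcodec API (not from the original answer; the frame size, frame rate, and the way packets are written out are assumptions for illustration), encoding YUV420p frames with x264 tuned for low latency:

```cpp
// Hypothetical sketch: encode raw YUV420p frames to H.264 with libavcodec/x264.
// Muxing into MPEG-TS and network delivery are left out for brevity.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}
#include <cstdio>
#include <stdexcept>

struct Encoder {
    AVCodecContext* ctx = nullptr;
    AVPacket* pkt = nullptr;

    Encoder(int width, int height, int fps) {
        const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
        if (!codec) throw std::runtime_error("H.264 encoder not found");
        ctx = avcodec_alloc_context3(codec);
        ctx->width = width;
        ctx->height = height;
        ctx->time_base = AVRational{1, fps};
        ctx->framerate = AVRational{fps, 1};
        ctx->pix_fmt = AV_PIX_FMT_YUV420P;
        ctx->gop_size = fps;                       // roughly one keyframe per second
        av_opt_set(ctx->priv_data, "preset", "ultrafast", 0);
        av_opt_set(ctx->priv_data, "tune", "zerolatency", 0);
        if (avcodec_open2(ctx, codec, nullptr) < 0)
            throw std::runtime_error("failed to open encoder");
        pkt = av_packet_alloc();
    }

    // Push one frame in, write any resulting H.264 packets to `out`.
    void encode(AVFrame* frame, FILE* out) {
        avcodec_send_frame(ctx, frame);            // frame == nullptr flushes the encoder
        while (avcodec_receive_packet(ctx, pkt) == 0) {
            fwrite(pkt->data, 1, pkt->size, out);  // in practice: mux/send instead of fwrite
            av_packet_unref(pkt);
        }
    }
};
```

Each AVFrame would be filled with pixel data converted from the OpenGL framebuffer; in a real pipeline the packets would go into an MPEG-TS muxer (libavformat) and out over the network rather than into a file.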

To view the video in a web browser, you need to pick a streaming format. HLS via the HTML5 video tag is supported by Safari, but unfortunately not by much else. To reach a broader set of clients, you will need a plugin such as Flash or a media player plugin.



The easiest path I can think of is to use Wowza for server-side repackaging. Your OpenGL application would send an MPEG-2 TS stream to Wowza, which would then prepare streams for HLS, RTMP (Flash), RTSP, and Microsoft Smooth Streaming (Silverlight). It costs about $1,000. You can set up an RTMP stream using Red5, which is free. Or you can do RTSP, which works with VLC, but RTSP support in web browsers is essentially nonexistent.

Unfortunately, at the moment the level of standardization for web video is very low and the tooling is quite cumbersome. It's a big undertaking, but you can hack something together with ffmpeg/libav. A proof of concept would be to write YUV420p image frames to a pipe that ffmpeg listens on, and to choose an output stream that you can read with a client such as VLC, QuickTime, or Windows Media Player.
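As a rough illustration of that proof of concept (not part of the original answer; the resolution, frame rate, output URL, and the use of popen are all assumptions), one could spawn ffmpeg as a child process and write raw YUV420p frames to its stdin:

```cpp
// Hypothetical sketch: pipe raw YUV420p frames into an ffmpeg child process,
// which encodes with libx264 and emits a low-latency MPEG-TS stream over UDP.
#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    const int width = 1280, height = 720, fps = 30;

    // ffmpeg reads raw frames from stdin ("-i -") and streams MPEG-TS to UDP.
    // The UDP target address/port is just an example.
    const char* cmd =
        "ffmpeg -f rawvideo -pix_fmt yuv420p -s 1280x720 -r 30 -i - "
        "-c:v libx264 -preset ultrafast -tune zerolatency "
        "-f mpegts udp://127.0.0.1:1234";

    FILE* pipe = popen(cmd, "w");
    if (!pipe) return 1;

    // One YUV420p frame = width*height luma bytes + width*height/2 chroma bytes.
    std::vector<uint8_t> frame(width * height * 3 / 2);

    for (int i = 0; i < fps * 60; ++i) {          // stream one minute of video
        // ... fill `frame` from the simulation (converted from RGB) ...
        fwrite(frame.data(), 1, frame.size(), pipe);
    }

    pclose(pipe);
    return 0;
}
```

Pointing VLC at udp://@:1234 (or swapping the output target for an RTSP/RTMP URL) should be enough to check the pipeline and latency end to end.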



Most live video is MPEG-2 internally, wrapped as RTMP (Flash) or HLS (Apple). There is probably a way to grab the OpenGL output as frames and convert them to MPEG-2 as a live stream, but I don't know exactly how (FFmpeg, perhaps?). After that, you can push the stream through Flash Media Live Encoder (free) and deliver it to Flash clients directly over RTMP, or publish it to Wowza Media Server to have it packaged for Flash, HTTP Live Streaming (Cupertino), and Smooth Streaming (Silverlight).
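For the frame-grabbing half of that idea, a minimal sketch (not from the original answer; the buffer handling and the hand-off to the encoder are assumptions) would read back each rendered frame with glReadPixels and pass the raw RGB data to whatever encoder or ffmpeg pipe is used:

```cpp
// Hypothetical sketch: capture the current OpenGL frame as raw RGB24.
// OpenGL returns rows bottom-up, so the image is flipped before encoding.
#include <GL/gl.h>
#include <cstdint>
#include <vector>
#include <algorithm>

// Reads the framebuffer into `rgb` (size width*height*3), top row first.
void captureFrame(int width, int height, std::vector<uint8_t>& rgb) {
    rgb.resize(static_cast<size_t>(width) * height * 3);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);                  // tightly packed rows
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, rgb.data());

    // Flip vertically: OpenGL's origin is the bottom-left corner.
    const size_t rowBytes = static_cast<size_t>(width) * 3;
    for (int y = 0; y < height / 2; ++y) {
        std::swap_ranges(rgb.begin() + y * rowBytes,
                         rgb.begin() + (y + 1) * rowBytes,
                         rgb.begin() + (height - 1 - y) * rowBytes);
    }
    // The RGB buffer can now be piped to ffmpeg with "-pix_fmt rgb24",
    // or converted to YUV420p and fed to a libavcodec encoder.
}
```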



Basically, you can chain together some COTS solutions and play the result in a standard player, without resorting to sockets and other low-level plumbing.
