Java - record video and audio simultaneously in FFmpeg

Hi fellow developers,

I am currently developing a tool that renders video by launching FFmpeg as a Java Process and piping video frames to its stdin.

I am currently using the following FFmpeg command:

    ffmpeg -y -f rawvideo -pix_fmt rgb24 -s %WIDTH%x%HEIGHT% -r %FPS% -i - -an -c:v libx264 -preset ultrafast -pix_fmt yuv420p "%FILENAME%.mp4"

where the placeholders are replaced with real values.

The code I am using to initialize FFmpeg:

    // commandArgs is the list of command-line arguments for FFmpeg
    List<String> command = new ArrayList<>();
    command.add("ffmpeg");
    command.addAll(commandArgs);

    process = new ProcessBuilder(command).directory(outputFolder).start();

    // Drain FFmpeg's stdout and stderr so the process can't block on a full pipe;
    // both are funneled into a single log file in the output folder
    OutputStream exportLogOut = new FileOutputStream(new File(outputFolder, "export.log"));
    new StreamPipe(process.getInputStream(), exportLogOut).start();
    new StreamPipe(process.getErrorStream(), exportLogOut).start();

    outputStream = process.getOutputStream();
    channel = Channels.newChannel(outputStream);


Then I use the following method to write a ByteBuffer containing one video frame to FFmpeg:

    public void consume(ByteBuffer buf) {
        try {
            // write() may not consume the whole buffer in a single call
            while (buf.hasRemaining()) {
                channel.write(buf);
            }
            ByteBufferPool.release(buf);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }


Now, my question is: how can I also write synchronized audio data to the output file? I assume I need multiple pipes, and of course I will have to change the command-line arguments, but I need help with the following:

1) What kind of audio data do I need to feed FFmpeg?
2) How do I feed audio and video at the same time?
3) How do I keep audio and video synchronized?


Thanks in advance for your help!

— CrushedPixel

+3




2 answers


This calls for a multiplexed format: ideally you feed FFmpeg a single stream in a container format that carries both audio and video. An example of how FFmpeg does this internally is the interaction between ffmpeg.exe and ffserver.exe, which happens through a custom/internal streaming file format called FFM. Full implementation details can be found here. Of course you can also use other mux formats, even one as simple as AVI. Synchronization then happens automatically, because the container carries timestamps.
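Alternatively, you can skip muxing on the Java side and simply give FFmpeg two inputs: the raw video on stdin as before, plus raw PCM audio from a second source. A minimal sketch of building such a command list, assuming a hypothetical named pipe `/tmp/audio.pcm` (created beforehand with `mkfifo`) and 44.1 kHz stereo 16-bit audio — adapt the paths and rates to your setup:

```java
import java.util.ArrayList;
import java.util.List;

public class FfmpegCommand {

    // Sketch (assumption): video still arrives on stdin ("-i -"), while raw
    // PCM audio is read from a hypothetical named pipe "/tmp/audio.pcm".
    public static List<String> build(int width, int height, int fps, String outFile) {
        List<String> cmd = new ArrayList<>();
        cmd.add("ffmpeg");
        cmd.add("-y");
        // video input: raw RGB frames on stdin
        cmd.addAll(List.of("-f", "rawvideo", "-pix_fmt", "rgb24",
                "-s", width + "x" + height, "-r", String.valueOf(fps), "-i", "-"));
        // audio input: raw signed 16-bit little-endian PCM, 44.1 kHz stereo
        cmd.addAll(List.of("-f", "s16le", "-ar", "44100", "-ac", "2",
                "-i", "/tmp/audio.pcm"));
        // encode both streams into a single output file
        cmd.addAll(List.of("-c:v", "libx264", "-preset", "ultrafast",
                "-pix_fmt", "yuv420p", "-c:a", "aac", outFile));
        return cmd;
    }
}
```

The named-pipe route avoids interleaving the two streams yourself; FFmpeg reads both inputs and muxes them.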



As far as the audio data type is concerned, it could really be anything; most people use raw, interleaved PCM audio (either float or int16).
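With raw PCM, keeping the streams aligned comes down to writing a fixed amount of audio per video frame. A small sketch of that arithmetic, assuming interleaved 16-bit samples (rates here are illustrative):

```java
public class AudioSync {

    // How many bytes of interleaved 16-bit PCM correspond to one video frame.
    // Assumes sampleRate divides evenly by fps (e.g. 44100 Hz at 60 fps).
    public static int bytesPerVideoFrame(int sampleRate, int channels, int fps) {
        int samplesPerFrame = sampleRate / fps; // e.g. 44100 / 60 = 735 samples
        return samplesPerFrame * channels * 2;  // 2 bytes per 16-bit sample
    }
}
```

Writing exactly this many bytes of audio for every video frame keeps the two streams in lockstep without explicit timestamps.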

+3




Take a look at https://github.com/artclarke/humble-video, which is a Java wrapper around FFmpeg. You can add video and audio streams dynamically to the encoder.



+1



