Android: exact video seeking

I am struggling with exact seeking using MediaExtractor's seekTo(). While I can seek to sync frames without issue, I would like to seek to a specific time. This question gave me an idea for how to do it, but I'm not sure if it is valid. Basically, I seek to the closest preceding sync frame and then advance() the extractor until the target time is reached. Each frame along the way is fed to the decoder, i.e. the first I-frame and the following P-frames. This is the relevant code snippet (based on Google's grafika MoviePlayer):

extractor.seekTo((long) seekTarget[threadNr], MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

...

while (extractor.getSampleTime() < (long) seekTarget[threadNr]) {
    Log.d(TAG, "Thread " + threadNr + " advanced to timestamp " + extractor.getSampleTime());

    int inputBufIndex = decoder.dequeueInputBuffer(TIMEOUT_USEC);
    if (inputBufIndex >= 0) {
        ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
        int chunkSize = extractor.readSampleData(inputBuf, 0);

        if (chunkSize < 0) {
            // End of stream -- send empty frame with EOS flag set.
            decoder.queueInputBuffer(inputBufIndex, 0, 0, 0L,
                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            inputDone = true;
            if (VERBOSE) Log.d(TAG, "sent input EOS");
        } else {
            if (extractor.getSampleTrackIndex() != trackIndex) {
                Log.w(TAG, "WEIRD: got sample from track " +
                        extractor.getSampleTrackIndex() + ", expected " + trackIndex);
            }

            long presentationTimeUs = extractor.getSampleTime();
            decoder.queueInputBuffer(inputBufIndex, 0, chunkSize,
                    presentationTimeUs, 0 /*flags*/);
            if (VERBOSE) {
                Log.d(TAG, "submitted frame " + inputChunk + " to dec, size=" +
                        chunkSize + " inputBufIndex: " + inputBufIndex);
            }
            inputChunk++;
            extractor.advance();
        }
    }
}


As you can see, this queues a large number of frames, but for now I'm fine with the memory consumption and possible latency. The problem is that dequeueInputBuffer() works only for a few iterations of the loop and then gets stuck returning -1, which according to the documentation means no input buffer is currently available. If I change TIMEOUT_USEC to -1, I get an infinite loop.

Can someone tell me whether this approach is appropriate, or why at some point I can no longer obtain an input buffer?



1 answer


It looks like you are not pulling buffers from the output side. A MediaCodec decoder does not drop frames, so once its internal buffers are full it will stop giving you input buffers.

You need to drain the decoder by requesting output buffers. When you release each output buffer, pass false for the "render" flag so the frame is not displayed on screen.
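A rough sketch of such a drain loop, interleaved with the input loop from the question, might look like the following. Note this is an illustration, not code from the original post: the names `info`, `outputBufIndex`, and `targetTimeUs` are assumptions, and error handling is omitted.

```java
// Drain the decoder's output side so it keeps handing out input buffers.
// Assumes `decoder` is a configured android.media.MediaCodec instance.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean outputDone = false;
while (!outputDone) {
    int outputBufIndex = decoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
    if (outputBufIndex >= 0) {
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
        // Only render the frame at (or past) the seek target; drop the rest.
        boolean render = info.presentationTimeUs >= targetTimeUs;
        decoder.releaseOutputBuffer(outputBufIndex, render);
        if (render) break;  // reached the target frame
    } else if (outputBufIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
        break;  // no output ready yet; go feed more input
    }
    // INFO_OUTPUT_FORMAT_CHANGED can be ignored for this purpose.
}
```

Calling something like this after each queueInputBuffer() in the seek loop keeps the decoder's internal buffers from filling up, which is what was causing dequeueInputBuffer() to return -1.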

