Is it possible to get good FPS with the Raspberry Pi V4L2 camera in C++?

I'm trying to stream video from a Raspberry Pi camera using the official V4L2 driver, in C++ on Raspbian (2015-02 release), and I'm getting low FPS.

Currently I just create a window and copy the buffer to the screen (which takes about 30 ms), whereas select() takes about 140 ms (only 5-6 frames per second). I've also tried sleeping for 100 ms, which reduces the time spent in select() by a similar amount (so the FPS stays the same). The processor load is about 5-15%.

I also tried changing the driver frame rate from the console (and via system()), but it only works downward: if I set the driver to 1 fps I get 1 fps, but if I set it to 90 fps I still get 5-6 fps, even though the driver confirms the 90 fps setting. Also, when I query the frame intervals available for the resolution in use, the driver reports 90 fps.
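
For reference, the same frame-rate request can also be made directly from code through VIDIOC_S_PARM rather than via the console. A minimal sketch (not part of the current program; it reuses the Xioctl and ErrnoExit helpers from the listing below, and the driver is free to adjust the values it returns):

//////////////////
// Request frame rate (sketch)
//////////////////
struct v4l2_streamparm parm;
memset(&parm, 0, sizeof(parm));
parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
// Ask for 90 fps by setting the time per frame to 1/90 s
parm.parm.capture.timeperframe.numerator = 1;
parm.parm.capture.timeperframe.denominator = 90;

if (-1 == Xioctl(VIDIOC_S_PARM, &parm))
    ErrnoExit("VIDIOC_S_PARM");

// The driver writes back what it actually granted
cout << "granted " << parm.parm.capture.timeperframe.denominator
     << "/" << parm.parm.capture.timeperframe.numerator << " fps\n";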

I have included the V4L2-related parts of the code (unrelated code between the parts is skipped):

//////////////////
// Open device
//////////////////
mFD = open(mDevName, O_RDWR | O_NONBLOCK, 0);
if (mFD == -1) ErrnoExit("Open device failed");

//////////////////
// Setup format
//////////////////
struct v4l2_format fmt;
memset(&fmt, 0, sizeof(fmt));
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
Xioctl(VIDIOC_G_FMT, &fmt);
mImgWidth = fmt.fmt.pix.width;
mImgHeight = fmt.fmt.pix.height;
cout << "width=" << mImgWidth << " height=" << mImgHeight << "\nbytesperline=" << fmt.fmt.pix.bytesperline << " sizeimage=" << fmt.fmt.pix.sizeimage << "\n";
// For some reason querying the format always sets pixelformat to JPEG
//  no matter the input, so set it back to YUYV
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
if (Xioctl(VIDIOC_S_FMT, &fmt) == -1)
{
    cout << "Set video format failed : " << strerror(errno) << "\n";
}

//////////////////
// Setup streaming
//////////////////
struct v4l2_requestbuffers req;

memset(&req, 0, sizeof(req));

req.count = 20;
req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
req.memory = V4L2_MEMORY_MMAP;

if (-1 == Xioctl(VIDIOC_REQBUFS, &req))
{
    ErrnoExit("Reqbufs");
}
if (req.count < 2)
    throw "Not enough buffer memory !";
mNBuffers = req.count;
mBuffers = new CBuffer[mNBuffers];
if (!mBuffers) throw "Out of memory !";

for (unsigned int i = 0; i < mNBuffers; i++)
{
    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;

    buf.index = i;

    if (-1 == Xioctl(VIDIOC_QUERYBUF, &buf))
        ErrnoExit("Querybuf");

    mBuffers[i].mLength = buf.length;
    mBuffers[i].pStart = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, mFD, buf.m.offset);

    if (mBuffers[i].pStart == MAP_FAILED)
        ErrnoExit("mmap");
}

//////////////////
// Start streaming
//////////////////
unsigned int i;
enum v4l2_buf_type type;
struct v4l2_buffer buf;

for (i = 0; i < mNBuffers; i++)
{
    memset(&buf, 0, sizeof(buf));

    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index = i;

    if (-1 == Xioctl(VIDIOC_QBUF, &buf))
        ErrnoExit("QBUF");
}
type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1==Xioctl(VIDIOC_STREAMON, &type))
    ErrnoExit("STREAMON");

      

And here are the last two parts of the main loop:

//////////////////
// Get frame
//////////////////
    FD_ZERO(&fds);
    FD_SET(mFD, &fds);
    tv.tv_sec = 3;
    tv.tv_usec = 0;

    struct timespec t0, t1;

    clock_gettime(CLOCK_REALTIME, &t0);

    // This line takes about 140ms which I don't get
    r = select(mFD + 1, &fds, NULL, NULL, &tv);

    clock_gettime(CLOCK_REALTIME, &t1);

    cout << "select time : " << ((float)(t1.tv_sec - t0.tv_sec))*1000.0f + ((float)(t1.tv_nsec - t0.tv_nsec))/1000000.0f << "\n";

    if (-1 == r)
    {
        if (EINTR == errno)
            continue;
        ErrnoExit("select");
    }

    if (r == 0)
        throw "Select timeout\n";

    // Read the frame
    //~ struct v4l2_buffer buf;
    memset(&mCurBuf, 0, sizeof(mCurBuf));
    mCurBuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    mCurBuf.memory = V4L2_MEMORY_MMAP;

    // DQBUF about 2ms
    if (-1 == Xioctl(VIDIOC_DQBUF, &mCurBuf))
    {
        if (errno == EAGAIN) continue;
        ErrnoExit("DQBUF");
    }

    clock_gettime(CLOCK_REALTIME, &mCaptureTime);

    // Manage frame in mBuffers[buf.index]
    mCurBufIndex = mCurBuf.index;

    break;
}

//////////////////
// Release frame
//////////////////
if (-1 == Xioctl(VIDIOC_QBUF, &mCurBuf))
    ErrnoExit("VIDIOC_QBUF during mainloop");

      

2 answers


I've looked at various ways of using the Pi camera and I'm hardly an expert (I've only just started), but it looks like the default camera settings are what's holding you back. There are many modes and switches; I don't know whether they are all exposed through ioctls. I had to use a program called v4l2-ctl to get things into the mode I wanted. A deep look at its source and some code lifting should allow you to achieve greatness. Oh, and I doubt the select() call is the problem; it just waits for the handle, which is slow to become readable. Depending on the mode, there may be mandatory waits for auto-exposure and so on. Edit: I meant to say "default settings", since you have changed some of them. There are also rules that are not encoded in the driver.
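
If you want to see from C++ which switches the driver actually exposes (roughly what v4l2-ctl --list-ctrls prints), you can walk the control list with VIDIOC_QUERYCTRL. A minimal sketch, assuming an already-open descriptor fd; drivers that don't support the NEXT_CTRL flag simply end the loop immediately:

#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Print every control the driver reports, to see which "modes and
// switches" (auto exposure and friends) are reachable via ioctl.
void ListControls(int fd)
{
    struct v4l2_queryctrl qc;
    memset(&qc, 0, sizeof(qc));
    qc.id = V4L2_CTRL_FLAG_NEXT_CTRL;           // start walking the control list

    while (ioctl(fd, VIDIOC_QUERYCTRL, &qc) == 0)
    {
        if (!(qc.flags & V4L2_CTRL_FLAG_DISABLED))
            printf("0x%08x  %-32s  min=%d max=%d default=%d\n",
                   qc.id, (const char *)qc.name,
                   qc.minimum, qc.maximum, qc.default_value);
        qc.id |= V4L2_CTRL_FLAG_NEXT_CTRL;      // ask for the next control
    }
}

Most of what shows up there can then be set with VIDIOC_S_CTRL (or VIDIOC_S_EXT_CTRLS for the camera-class controls), which is how you would script the same changes you make with v4l2-ctl.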



The pixel format matters. I ran into a similar low-FPS problem and spent some time testing my program, in Go and in C++, against the V4L2 API. What I found is that the Raspberry Pi camera module has good acceleration for the H.264 and MJPG pixel formats: I can easily get 60 fps at 640x480, the same as with uncompressed formats like YUYV/RGB. JPEG, however, is very slow; I only get 4 fps even at 320x240. I also found that the current draw is higher with JPEG (over 700 mA) than with H.264/MJPG (about 500 mA).
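
For illustration, requesting MJPG at 640x480 looks roughly like this (a sketch only, using plain ioctl on an already-open descriptor fd; the driver may silently adjust the width, height or format, so check what comes back):

#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Ask the driver for 640x480 MJPG. VIDIOC_S_FMT may adjust the values,
// and the fmt it writes back tells you what was actually granted.
bool RequestMjpg(int fd)
{
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 640;
    fmt.fmt.pix.height      = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;
    fmt.fmt.pix.field       = V4L2_FIELD_NONE;

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) == -1)
        return false;

    printf("granted %ux%u, sizeimage=%u\n",
           fmt.fmt.pix.width, fmt.fmt.pix.height, fmt.fmt.pix.sizeimage);
    return fmt.fmt.pix.pixelformat == V4L2_PIX_FMT_MJPEG;
}

Keep in mind that the dequeued buffers then contain compressed frames (bytesused varies per frame), so a display path that expects raw YUYV, like the one in the question, would need a decode step.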


