Is there an awaitable queue in C++?

I use concurrency::task from ppltasks.h heavily in my codebase.

I would like to find an awaitable queue where I can write "co_await my_queue.pop()". Has anyone implemented one?

More details: I have one producer thread that pushes items into the queue, and a receiving thread that waits and wakes up when items are queued. While it waits, the receiving thread can process other ready tasks (using pplpp::when_any).

I do not want an interface where I have to poll a try_pop method, since that is wasteful, and I do not want a blocking pop method either, since it would prevent me from processing other ready tasks.



2 answers


This is basically the standard threaded implementation of a queue, but instead of a condition_variable you use futures to coordinate the threads. You can then co_await the future returned from pop until the item is ready.

The queue implementation should hold a list of promises corresponding to outstanding pop calls. If the queue already contains items when pop is called, you can return a finished future immediately. A plain old std::mutex is enough to synchronize concurrent access to the underlying data structures.
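A minimal sketch of that design, assuming standard C++ only (the class name `awaitable_queue` is made up for illustration): pop() hands back a std::future<T> that is either already finished or fulfilled later by push(). With PPL you would use concurrency::task_completion_event and concurrency::task instead, so the caller can co_await the result.

```cpp
#include <deque>
#include <future>
#include <mutex>
#include <utility>

// Hypothetical sketch: a queue whose pop() returns a future instead of blocking.
template <typename T>
class awaitable_queue {
public:
    void push(T value) {
        std::promise<T> waiter;
        {
            std::lock_guard<std::mutex> lock(mutex_);
            if (!waiters_.empty()) {          // a pop() is already pending
                waiter = std::move(waiters_.front());
                waiters_.pop_front();
            } else {                          // no one waiting: store the item
                items_.push_back(std::move(value));
                return;
            }
        }
        waiter.set_value(std::move(value));   // fulfil outside the lock
    }

    std::future<T> pop() {
        std::lock_guard<std::mutex> lock(mutex_);
        if (!items_.empty()) {                // item ready: return a finished future
            std::promise<T> done;
            done.set_value(std::move(items_.front()));
            items_.pop_front();
            return done.get_future();
        }
        waiters_.emplace_back();              // register an outstanding pop call
        return waiters_.back().get_future();
    }

private:
    std::mutex mutex_;
    std::deque<T> items_;                     // items with no waiter yet
    std::deque<std::promise<T>> waiters_;     // pops with no item yet
};
```

Fulfilling the promise outside the lock matters: the continuation attached to the future may run inline in set_value, and holding the mutex there risks deadlock.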



I don't know of an existing implementation that does this, but it shouldn't be too hard to write. Note that managing all the futures adds overhead, so this queue will be slightly less efficient than the classic condition_variable approach.



Posted a comment, but I might as well write this as an answer, since it's long and needs formatting.

Basically, you have two options:

Lock-free queues, the most popular of which is the following:

https://github.com/cameron314/concurrentqueue

They expose try_pop because the implementation uses atomic pointers, and atomic operations (such as std::atomic_compare_exchange_weak) can and will "fail" and return false from time to time, so you have to loop over them.
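To illustrate that point, here is a minimal, self-contained example (the function `fetch_multiply` is made up for illustration, not part of any library): compare_exchange_weak may fail spuriously or because another thread won the race, so lock-free code retries it in a loop.

```cpp
#include <atomic>

// Multiply an atomic value in place using a CAS retry loop.
// compare_exchange_weak is allowed to fail spuriously; on failure it
// reloads `expected` with the current value, and we simply try again.
int fetch_multiply(std::atomic<int>& value, int factor) {
    int expected = value.load();
    while (!value.compare_exchange_weak(expected, expected * factor)) {
        // retry: `expected` now holds the value another thread stored
    }
    return expected * factor;  // the value we successfully installed
}
```

A try_pop method surfaces exactly this "may fail, call again" behavior to the caller instead of hiding the retry loop.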



You can find queues that hide this inside a "pop" that simply calls "try_pop" until it succeeds, but the same overhead is still there under the hood.

Blocking queues:

These are easier to write yourself, without a third-party library: just wrap every method you need in locks. If you read much more often than you write, consider shared locks (std::shared_mutex); otherwise a plain std::lock_guard should be enough to protect the entire wrapper. This is what you might call a "blocking" queue, because during any access, whether a read or a write, the entire queue is locked.
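A minimal sketch of such a wrapper, assuming standard C++ only (the class name `blocking_queue` is made up for illustration): every method takes the same mutex, so all readers and writers serialize on the whole structure, and pop blocks the calling thread via a condition_variable.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>

// A simple "blocking" queue: one mutex guards the whole std::queue.
template <typename T>
class blocking_queue {
public:
    void push(T value) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            items_.push(std::move(value));
        }
        ready_.notify_one();   // wake one waiting pop, after releasing the lock
    }

    // Blocks the calling thread until an item is available.
    T pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        ready_.wait(lock, [this] { return !items_.empty(); });
        T value = std::move(items_.front());
        items_.pop();
        return value;
    }

private:
    std::mutex mutex_;
    std::condition_variable ready_;
    std::queue<T> items_;
};
```

Note this is exactly the blocking pop the questioner wanted to avoid; it is shown here only to contrast the two approaches.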

There are no fundamentally different alternatives to these two approaches. If you need a really large queue (for example, hundreds of GB of memory worth of objects) under heavy use, you might consider writing some kind of custom hybrid data structure, but for most use cases moodycamel's queue will be more than enough.



