Why does the thread pool manage threads this way?

Joe Albahari gives a great explanation of how the .NET thread pool manages threads, and why it works the way it does, in his Threading in C# e-book.

From what I understand, once all processor cores are saturated, the thread pool by default delays the creation of new threads: if every core is already busy with computation, additional threads can no longer improve the application's overall throughput (tasks completed per second), so they would just waste system resources.

However, if a task has been sitting in the thread pool queue for too long, the pool assumes that the existing threads are blocked or idle in some way and injects a new thread to make use of that idle time while the queued task runs.
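A small sketch of how you can observe this injection behavior (the timing details are an assumption about the current runtime; the exact injection rate is an implementation detail):

```csharp
// Sketch: observe the thread pool's gradual thread injection.
// Queue more blocking work items than there are cores and watch
// the number of busy worker threads grow slowly past the core count.
using System;
using System.Threading;

class InjectionDemo
{
    static void Main()
    {
        // Queue 4x as many long-blocking work items as cores.
        for (int i = 0; i < Environment.ProcessorCount * 4; i++)
            ThreadPool.QueueUserWorkItem(_ => Thread.Sleep(10_000));

        // Sample the busy worker-thread count once per second.
        for (int s = 0; s < 5; s++)
        {
            ThreadPool.GetAvailableThreads(out int worker, out _);
            ThreadPool.GetMaxThreads(out int max, out _);
            Console.WriteLine($"busy worker threads: {max - worker}");
            Thread.Sleep(1000);
        }
    }
}
```

On a typical run the count jumps to roughly the core count immediately and then climbs slowly, which is the delay the question describes.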

Instead of this delay-based algorithm, wouldn't it make more sense in many cases to use a technique where thread pool threads set a special property that signals a "waiting state"? It might look something like this:

System.Threading.Thread.CurrentThread.IsWaiting = true;


The thread pool would create new threads immediately for queued tasks until all processor cores were occupied by non-waiting threads. After that, tasks would stay in the queue until one of the running threads either completed or signaled the waiting state.
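The proposed rule could be sketched like this (the `IsWaiting` flag does not exist in the real API; the counters and the `ShouldStartNewThread` helper are purely hypothetical illustrations of the question's idea):

```csharp
// Hypothetical sketch of the question's scheduling rule.
// _waitingThreads stands in for threads that set the imaginary
// IsWaiting flag; none of these names are real ThreadPool API.
using System;

class ProposedPool
{
    static int _runningThreads;   // threads the pool has started
    static int _waitingThreads;   // threads that signaled "waiting"

    static bool ShouldStartNewThread() =>
        // Start another thread only while some core is not
        // occupied by a non-waiting thread.
        _runningThreads - _waitingThreads < Environment.ProcessorCount;

    static void Main()
    {
        _runningThreads = Environment.ProcessorCount;
        _waitingThreads = 2; // two threads declared themselves waiting
        Console.WriteLine(ShouldStartNewThread()); // True: 2 cores are effectively free
    }
}
```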

This would bring a couple of benefits. First, if a processor core is idle, tasks would always start the moment they are queued to the pool, without delay. Second, in an application that runs many computationally intensive tasks taking more than half a second each, the thread pool would not keep burdening the system with unnecessary extra threads.

Of course, there may be applications that must complete tasks on a strict deadline and cannot wait for other tasks to finish; this algorithm would not suit them. Otherwise, I would expect it to improve the performance of multithreaded applications.

What do you think?



1 answer


We do have this information available in the Thread.ThreadState property, and it would be nice if the thread pool used it. But to use it, the pool threads and the component collecting the information would have to communicate, which means some synchronization, or at least volatile accesses. Both are really expensive. We would therefore impose a runtime cost on every application that uses the ThreadPool, while only a few of them would benefit.
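A minimal sketch showing that the wait state is already observable via Thread.ThreadState, and that reading it is inherently a cross-thread observation (the 100 ms pause is an assumption to let the thread enter its sleep first):

```csharp
// Sketch: Thread.ThreadState already exposes a wait state,
// but checking it from another thread is a racy cross-thread read.
using System;
using System.Threading;

class StateDemo
{
    static void Main()
    {
        var t = new Thread(() => Thread.Sleep(5000));
        t.Start();
        Thread.Sleep(100); // give the thread time to start sleeping

        // Likely prints WaitSleepJoin while the thread is asleep,
        // but the state can change between the read and any decision
        // a pool would make based on it.
        Console.WriteLine(t.ThreadState);
        t.Join();
    }
}
```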

As a programmer, you need to think about how the thread pool is used. If the default behavior doesn't suit you, you can configure the pool, e.g. with ThreadPool.SetMinThreads if you know many of your threads will be waiting. That is not as automatic as you would like, but your automatic scheme wouldn't be perfect either: it could easily end up with too many runnable threads when several of the waiting threads wake up at the same time.
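For example, raising the worker-thread floor makes the pool create threads up to that floor immediately instead of applying its injection delay (the value 50 here is an arbitrary illustration, not a recommendation):

```csharp
// Sketch: raise the minimum worker-thread count so threads up to
// that floor are created on demand, without the ramp-up delay.
using System;
using System.Threading;

class MinThreadsDemo
{
    static void Main()
    {
        ThreadPool.GetMinThreads(out int worker, out int io);
        Console.WriteLine($"defaults: {worker} workers, {io} I/O");

        // If you know many tasks will block (e.g. on I/O),
        // allow up to 50 workers to start without the delay.
        bool ok = ThreadPool.SetMinThreads(50, io);
        Console.WriteLine($"SetMinThreads succeeded: {ok}");
    }
}
```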



Note that other thread pools often do not have the smart injection heuristic that is built into the C# variant at all. They usually run a fixed number of threads, and you will never get more than that number.







