Task Schedulers
Task schedulers are represented by the System.Threading.Tasks.TaskScheduler class. A task scheduler makes sure that the work of a task is eventually executed. The default task scheduler is based on the .NET Framework 4 ThreadPool, which provides work-stealing for load-balancing, thread injection/retirement for maximum throughput, and overall good performance. It should be sufficient for most scenarios. However, if you require special functionality, you can create a custom scheduler and enable it for specific tasks or queries. For more information about how to create and use a custom task scheduler, see How to: Create a Task Scheduler That Limits the Degree of Concurrency. For additional examples of custom schedulers, see Parallel Extensions Samples on the MSDN Code Gallery Web site.
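As an illustration only, the following C# sketch shows one possible custom scheduler: a scheduler that runs every queued task on a single dedicated thread. It is a minimal example written for this article, not the scheduler described in the How to topic, and the class name SingleThreadTaskScheduler is hypothetical.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical example: a custom scheduler that executes all queued tasks on one dedicated thread.
public class SingleThreadTaskScheduler : TaskScheduler
{
    private readonly BlockingCollection<Task> _tasks = new BlockingCollection<Task>();

    public SingleThreadTaskScheduler()
    {
        var thread = new Thread(() =>
        {
            // Drain the queue and execute each task on this dedicated thread.
            foreach (Task task in _tasks.GetConsumingEnumerable())
            {
                TryExecuteTask(task);
            }
        });
        thread.IsBackground = true;
        thread.Start();
    }

    protected override void QueueTask(Task task)
    {
        _tasks.Add(task);
    }

    protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
    {
        // Keep the example simple: never execute a task inline on the calling thread.
        return false;
    }

    protected override IEnumerable<Task> GetScheduledTasks()
    {
        return _tasks.ToArray();
    }
}

A task can then be started on this scheduler by passing it to an overload of StartNew, for example Task.Factory.StartNew(action, CancellationToken.None, TaskCreationOptions.None, new SingleThreadTaskScheduler()).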
Default Task Scheduler and the ThreadPool
The default scheduler for Task Parallel Library and PLINQ uses the .NET Framework ThreadPool to queue and execute work. In the .NET Framework 4, the ThreadPool uses the information that is provided by the System.Threading.Tasks.Task type to efficiently support the fine-grained parallelism (short-lived units of work) that parallel tasks and queries often represent.
ThreadPool Global Queue vs. Local Queues
As in earlier versions of the .NET Framework, the ThreadPool maintains a global FIFO (first-in, first-out) work queue for threads in each application domain. Whenever a program calls QueueUserWorkItem (or UnsafeQueueUserWorkItem), the work is put on this shared queue and eventually dequeued by the next thread that becomes available. In the .NET Framework 4, this queue has been improved to use a lock-free algorithm that resembles the ConcurrentQueue class. By using this lock-free implementation, the ThreadPool spends less time queuing and dequeuing work items. This performance benefit is available to all programs that use the ThreadPool.
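The following minimal sketch shows a work item being placed on the global queue through QueueUserWorkItem. The ManualResetEvent is used here only to keep the example process alive until the work item runs.

using System;
using System.Threading;

class GlobalQueueExample
{
    static void Main()
    {
        using (var done = new ManualResetEvent(false))
        {
            // This delegate is placed on the ThreadPool's shared global FIFO queue.
            ThreadPool.QueueUserWorkItem(state =>
            {
                Console.WriteLine("Running on a ThreadPool thread.");
                done.Set();
            });

            done.WaitOne(); // Wait until the work item has run.
        }
    }
}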
Top-level tasks, which are tasks that are not created in the context of another task, are put on the global queue just like any other work item. However, nested or child tasks, which are created in the context of another task, are handled quite differently. A child or nested task is put on a local queue that is specific to the thread on which the parent task is executing. The parent task may itself be a top-level task or the child of another task. When this thread is ready for more work, it first looks in the local queue. If work items are waiting there, they can be accessed quickly. The local queues are accessed in last-in, first-out (LIFO) order to preserve cache locality and reduce contention. For more information about child tasks and nested tasks, see Nested Tasks and Child Tasks.
The following example shows some tasks that are scheduled on the global queue and other tasks that are scheduled on the local queue.
Sub QueueTasks()
    ' TaskA is a top level task.
    Dim taskA = Task.Factory.StartNew(Sub()
                                          Console.WriteLine("I was enqueued on the thread pool's global queue.")

                                          ' TaskB is a nested task and TaskC is a child task. Both go to local queue.
                                          Dim taskB = New Task(Sub() Console.WriteLine("I was enqueued on the local queue."))
                                          Dim taskC = New Task(Sub() Console.WriteLine("I was enqueued on the local queue, too."),
                                                               TaskCreationOptions.AttachedToParent)
                                          taskB.Start()
                                          taskC.Start()
                                      End Sub)
End Sub
void QueueTasks()
{
    // TaskA is a top level task.
    Task taskA = Task.Factory.StartNew(() =>
    {
        Console.WriteLine("I was enqueued on the thread pool's global queue.");

        // TaskB is a nested task and TaskC is a child task. Both go to local queue.
        Task taskB = new Task(() => Console.WriteLine("I was enqueued on the local queue."));
        Task taskC = new Task(() => Console.WriteLine("I was enqueued on the local queue, too."),
                              TaskCreationOptions.AttachedToParent);
        taskB.Start();
        taskC.Start();
    });
}
The use of local queues not only reduces pressure on the global queue, it also takes advantage of data locality. Work items in the local queue frequently reference data structures that are physically near to one another in memory. In these cases, the data is already in the cache after the first task has run, and can be accessed quickly. Both Parallel LINQ (PLINQ) and the Parallel class use nested tasks and child tasks extensively, and achieve significant speedups by using the local work queues.
Work Stealing
The .NET Framework 4 ThreadPool also features a work-stealing algorithm to help make sure that no threads sit idle while others still have work in their queues. When a thread-pool thread is ready for more work, it first looks at the head of its local queue, then in the global queue, and then in the local queues of other threads. If it finds a work item in the local queue of another thread, it first applies heuristics to make sure that it can run the work efficiently. If it can, it dequeues the work item from the tail (in FIFO order). This reduces contention on each local queue and preserves data locality. This architecture helps the .NET Framework 4 ThreadPool load-balance work more efficiently than past versions did.
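The following C# sketch is only a conceptual illustration of that discipline, not the ThreadPool's actual lock-free implementation: the owning thread pushes and pops work at the head of its queue in LIFO order, while other threads steal from the tail in FIFO order. The type name WorkStealingQueue is hypothetical, and a simple lock is used purely for clarity.

using System.Collections.Generic;

// Hypothetical illustration of a work-stealing local queue.
class WorkStealingQueue<T>
{
    private readonly LinkedList<T> _items = new LinkedList<T>();
    private readonly object _sync = new object();

    // The owning thread adds new work at the head.
    public void LocalPush(T item)
    {
        lock (_sync)
        {
            _items.AddFirst(item);
        }
    }

    // The owning thread takes the most recently added item (LIFO),
    // which is the item most likely to still be warm in its cache.
    public bool LocalPop(out T item)
    {
        lock (_sync)
        {
            if (_items.Count == 0) { item = default(T); return false; }
            item = _items.First.Value;
            _items.RemoveFirst();
            return true;
        }
    }

    // Other threads steal the oldest item from the tail (FIFO),
    // staying away from the end the owning thread is working on.
    public bool TrySteal(out T item)
    {
        lock (_sync)
        {
            if (_items.Count == 0) { item = default(T); return false; }
            item = _items.Last.Value;
            _items.RemoveLast();
            return true;
        }
    }
}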
Long-Running Tasks
You may want to explicitly prevent a task from being put on a local queue. For example, you may know that a particular work item will run for a relatively long time and is likely to block all other work items on the local queue. In this case, you can specify the LongRunning option, which provides a hint to the scheduler that an additional thread might be required for the task so that it does not block the forward progress of other threads or work items on the local queue. By using this option, you avoid the ThreadPool entirely, including the global and local queues.
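The following minimal sketch passes TaskCreationOptions.LongRunning to StartNew; the Thread.Sleep call simply stands in for a long-running or blocking operation.

using System;
using System.Threading;
using System.Threading.Tasks;

class LongRunningExample
{
    static void Main()
    {
        // The LongRunning hint tells the scheduler that this task may run or block for
        // a long time, so it is typically given its own thread rather than occupying a
        // ThreadPool thread and its local queue.
        Task longTask = Task.Factory.StartNew(() =>
        {
            Thread.Sleep(5000); // Stand-in for a long-running or blocking operation.
            Console.WriteLine("Long-running work finished.");
        },
        TaskCreationOptions.LongRunning);

        longTask.Wait();
    }
}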
Task Inlining
In some cases, when a Task is waited on, it may be executed synchronously on the thread that is performing the wait operation. This can improve performance because it avoids the need for an additional thread by reusing the thread that would otherwise have blocked. To prevent errors due to re-entrancy, task inlining occurs only when the wait target is found in the relevant thread's local queue.
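The following sketch shows a situation in which inlining can occur: the outer task waits on a nested task that may still be sitting in the same thread's local queue, so the Wait call may run the nested task inline instead of blocking. Whether inlining actually happens is up to the scheduler.

using System;
using System.Threading;
using System.Threading.Tasks;

class InliningExample
{
    static void Main()
    {
        Task outer = Task.Factory.StartNew(() =>
        {
            Console.WriteLine("Outer task on thread {0}.",
                Thread.CurrentThread.ManagedThreadId);

            Task nested = Task.Factory.StartNew(() =>
                Console.WriteLine("Nested task on thread {0} (possibly inlined).",
                    Thread.CurrentThread.ManagedThreadId));

            // If the nested task has not started yet and is still in this thread's
            // local queue, Wait may execute it inline here instead of blocking.
            nested.Wait();
        });

        outer.Wait();
    }
}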
Specifying a Synchronization Context
You can use the TaskScheduler.FromCurrentSynchronizationContext method to specify that a task should be scheduled to run on a particular thread. This is useful in frameworks such as Windows Forms and Windows Presentation Foundation, where access to user interface objects is often restricted to code that is running on the same thread on which the UI object was created. For more information, see How to: Schedule Work on a Specified Synchronization Context.
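The following sketch assumes a Windows Forms application; the form, button, and label names are illustrative only. The background computation runs on the default scheduler, and the continuation is scheduled back to the UI thread through the scheduler returned by FromCurrentSynchronizationContext, so it can safely update the label.

using System;
using System.Threading;
using System.Threading.Tasks;
using System.Windows.Forms;

public class SchedulerForm : Form
{
    private readonly Label resultLabel = new Label { Dock = DockStyle.Top };
    private readonly Button startButton = new Button { Text = "Start", Dock = DockStyle.Bottom };

    public SchedulerForm()
    {
        Controls.Add(resultLabel);
        Controls.Add(startButton);
        startButton.Click += StartButton_Click;
    }

    private void StartButton_Click(object sender, EventArgs e)
    {
        // Capture a scheduler tied to the UI thread's synchronization context.
        TaskScheduler uiScheduler = TaskScheduler.FromCurrentSynchronizationContext();

        Task.Factory.StartNew<string>(DoBackgroundWork)          // runs on a ThreadPool thread
            .ContinueWith(t => resultLabel.Text = t.Result,      // runs on the UI thread
                          uiScheduler);
    }

    private static string DoBackgroundWork()
    {
        Thread.Sleep(1000); // Stand-in for a long computation.
        return "Background work finished.";
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new SchedulerForm());
    }
}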
See Also
Tasks
How to: Create a Task Scheduler That Limits the Degree of Concurrency
How to: Schedule Work on a Specified Synchronization Context