Ed Essey, a Program Manager on the Microsoft Parallel Computing team, has written about the latest enhancements to PLINQ that will appear in .NET 4.0 Beta 1, due to be released soon. Among them are the “With” operators pattern, execution mode, cancellation, refactoring, and performance improvements.
The complete list of PLINQ enhancements in Beta 1 is:
- “With” Operators Pattern
- Execution Mode
- Cancellation
- Custom Partitioning
- Refactoring
- Merge Options
- AsMerged Renamed Back to AsSequential
- Binary Operators Now Require AsParallel on Both Sides
- Performance Improvements
- Removed Seldom Used Operators
“With” Operators Pattern. There are four new methods, combined in the sketch after this list:
- e.AsParallel().WithDegreeOfParallelism
- e.AsParallel().WithExecutionMode
- e.AsParallel().WithCancellation
- e.AsParallel().WithMergeOptions
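Put together, the four methods chain on the same query. A minimal sketch; data, cts, and Process are hypothetical names used here only for illustration:
var cts = new CancellationTokenSource();
var results = data.AsParallel()
                  .WithDegreeOfParallelism(4)
                  .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
                  .WithCancellation(cts.Token)
                  .WithMergeOptions(ParallelMergeOptions.AutoBuffered)
                  .Select(x => Process(x));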
Execution Mode. PLINQ was tuned to consume resources similar to those of a LINQ-to-Objects query, especially with respect to memory consumption. When a PLINQ query is expected to consume significantly more resources than its sequential counterpart, it is executed sequentially instead of in parallel. The decision to fall back to sequential execution is made based on the query’s shape. The following queries are executed sequentially:
- Queries that contain an indexed Select, indexed Where, indexed SelectMany, or ElementAt in a position where the indices are no longer in the original order. Index ordering is sensitive to operators that change ordering (e.g. OrderBy) and operators that remove elements (e.g. Where).
- Queries that contain the operators Take, TakeWhile, Skip, or SkipWhile when the indices are not in the original order (see the previous bullet).
- Queries that contain Zip or SequenceEqual, unless one of the data sources has an originally ordered index and the other data source is indexable (i.e. an array or IList<T>).
- Queries that contain Concat, unless it is applied to indexable data sources.
- Queries that contain Reverse, unless applied to an indexable data source.
To force parallel execution one can use:
e.AsParallel().WithExecutionMode(ParallelExecutionMode.ForceParallelism)
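For example, per the rules above, a query whose TakeWhile follows a Where sees indices that are no longer in the original order and would default to sequential execution; the hint overrides that. A hypothetical sketch (src, Filter, and limit are made-up names):
var q = src.AsParallel()
           .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
           .Where(x => Filter(x))       // Where removes elements, so indices lose their original order
           .TakeWhile(x => x < limit);  // would otherwise trigger the sequential fallback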
Cancellation. Parallel operations can be cancelled as this example shows:
var cts = new CancellationTokenSource();
var q = a.AsParallel().WithCancellation(cts.Token).Where(x => Filter(x)).Select(x => DoWork(x));
-- separate thread --
foreach (var e in q) { … } // Statement 1
-- separate thread --
var l = q.ToList(); // Statement 2
-- separate thread --
cts.Cancel(); // this will attempt to cancel any in-flight queries,
// including both statements 1 and 2
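On the consuming side, cancellation surfaces as an exception; a minimal sketch guarding Statement 1, assuming cancellation is observed as an OperationCanceledException on the consuming thread (as it is in the final .NET 4 API):
try
{
    foreach (var e in q) { /* consume */ }   // Statement 1 from above
}
catch (OperationCanceledException)
{
    // reached when cts.Cancel() was called while the query was still running
}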
Custom Partitioning. The classes Partitioner<TSource>, OrderablePartitioner<TSource> and the factory class Partitioner offer control over how data is partitioned.
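A minimal sketch using the Partitioner factory rather than a hand-written partitioner; data and Process are hypothetical, and loadBalance: true asks for dynamic, load-balanced partitioning:
// Partitioner and OrderablePartitioner live in System.Collections.Concurrent
int[] data = Enumerable.Range(0, 1000000).ToArray();
var partitioner = Partitioner.Create(data, true);   // load-balanced partitioning of the array
var results = partitioner.AsParallel()
                         .Select(x => Process(x))
                         .ToList();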
Refactoring. The interfaces IParallelEnumerable, IParallelEnumerable<T>, and IParallelOrderedEnumerable<T> are no longer interfaces but classes that cannot be extended: ParallelQuery, ParallelQuery<TSource>, and OrderedParallelQuery<TSource>. The reason is that they were not intended to be extended in the first place.
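In practice this mostly affects code that names the query type explicitly, for example a helper method; a hypothetical sketch:
// The parameter and return types are now the ParallelQuery<T> class,
// where the IParallelEnumerable<T> interface would have been used before.
static ParallelQuery<int> OnlyEven(ParallelQuery<int> source)
{
    return source.Where(x => x % 2 == 0);
}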
Merge Options. “Handling of ParallelMergeOptions has been moved out of AsMerged. Merge buffering is now specified via the WithMergeOptions method.”
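A minimal sketch (data and Expensive are hypothetical); NotBuffered hands each result to the consumer as soon as it is produced, while FullyBuffered waits for the whole query to finish:
var q = data.AsParallel()
            .WithMergeOptions(ParallelMergeOptions.NotBuffered)
            .Select(x => Expensive(x));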
AsMerged. AsMerged has been renamed back to AsSequential, its earlier name, to pair with AsParallel as it did before.
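AsSequential is the opposite switch to AsParallel: it drops the rest of the pipeline back to LINQ-to-Objects. A hypothetical sketch (data and Expensive are made-up names):
var first10 = data.AsParallel()
                  .Select(x => Expensive(x))   // runs in parallel
                  .AsSequential()              // remaining operators run sequentially
                  .Take(10)
                  .ToList();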
Binary Operators. LINQ operators that take two data sources now require AsParallel on both sides. Operations like:
a.AsParallel().AsOrdered().Zip(b, (x, y) => x*y);
must be written as follows to be parallelized:
a.AsParallel().AsOrdered().Zip(b.AsParallel(), (x, y) => x*y);
or
a.AsParallel().AsOrdered().Zip(b.AsParallel().AsOrdered(), (x, y) => x*y);
Operators affected: Zip, Join, GroupJoin, Concat, SequenceEqual, Union, Intersect, Except.
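The same pattern applies to the other operators in the list; for instance, a hypothetical Join (orders, customers, and the key selectors are made up for illustration):
var pairs = orders.AsParallel()
                  .Join(customers.AsParallel(),   // AsParallel on the second source too
                        o => o.CustomerId,
                        c => c.Id,
                        (o, c) => new { o, c });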
Performance Improvements.
1. Order-preserving pipelining merges – Previously, merely adding AsOrdered to a query forced the entire query to execute before a single element was yielded. This has been optimized so that elements can be yielded as they are produced when the Default (AutoBuffered) or NotBuffered values of MergeOptions are used (see the sketch after this list).
2. Improved partitioning fairness for data sources that don't implement IList<T>
3. Better performance of some queries over IList<T> or an array
4. Chunk partitioning size tuning - Chunk partitioning is the most common partitioning scheme for queries over data sources other than IList<T> and arrays (i.e. non-indexable data sources). The sizes of these chunks now grow as more and more chunks are accessed. This balances between the cases of a) queries with small data sets but expensive delegates and b) queries with large data sets but inexpensive delegates.
5. Removal of likely false-sharing cases, which has yielded improvements of up to 6x in some cases
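To illustrate item 1, a hypothetical ordered query (data and Expensive are made-up names): with the Default (AutoBuffered) or NotBuffered merge, the foreach below starts receiving ordered elements while the query is still executing, instead of waiting for the full result set as in earlier previews.
var q = data.AsParallel()
            .AsOrdered()
            .Select(x => Expensive(x));

foreach (var item in q)
    Console.WriteLine(item);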
Removed Seldom Used Operators. Some operators that were created for performance reasons but did not offer any performance advantage over LINQ have been removed. It was not specified which operators were removed.