CSC/ECE 506 Spring 2012/3a kp: Difference between revisions

From Expertiza_Wiki

Revision as of 03:33, 10 February 2012

Patterns of parallel programming

1. Overview

A design pattern is a description of a solution to a frequently occurring problem in some domain. The description makes it easy for the reader to quickly understand both the problem and the proposed solution. Because the pattern has a name, a collection of patterns provides a vocabulary with which to talk about these solutions.

As an example, “Dense Linear Algebra” is a “computational” design pattern for parallel programming. Under this pattern a computation is organized as a sequence of arithmetic expressions acting on dense arrays of data. The operations and data-access patterns are well defined mathematically, so data can be prefetched and CPUs can execute close to their theoretical peak performance.
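As an illustrative sketch (not from the source), the regular access pattern that defines Dense Linear Algebra can be seen in a plain matrix multiplication: every loop iteration touches the dense arrays in a predictable order, which is what makes prefetching effective.

```python
# Minimal sketch of the Dense Linear Algebra pattern: matrix
# multiplication over dense 2-D lists. The fixed triple-loop
# structure gives a predictable, prefetch-friendly access pattern.

def matmul(a, b):
    """Multiply dense matrices a (n x k) and b (k x m); return n x m result."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):       # innermost loop walks a row of a and a column of b
                s += a[i][p] * b[p][j]
            c[i][j] = s
    return c
```

In an optimized library the same pattern would be blocked for cache reuse, but the mathematical structure is identical.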

Parallel programming design patterns have been identified at various levels of software development: high-level patterns that describe how an application is organized, mid-level patterns about specific classes of computations, and low-level patterns describing specific execution strategies. Alongside these classifications, several researchers have published “pattern languages” that guide the software developer through a hierarchy of parallel design decisions (http://www.cise.ufl.edu/research/ParallelPatterns, http://parlab.eecs.berkeley.edu/wiki/patterns/patterns).

In this article we first cover three of these “pattern languages” for parallel programming in some detail and then dwell on the commonalities of these classification approaches. Thereafter, we cover a broad range of parallel programming design patterns, highlighting their commonalities and differences.

It is important to appreciate that many design problems do not lend themselves to a purely top-down or bottom-up analysis. In many cases the pathway through the patterns will bounce between layers, with the designer working at whichever layer is most productive at a given time.

2. Parallel Pattern Programming Languages

We cover three "Parallel Pattern Programming Languages" and their commonalities.

1. A classification developed by B. L. Massingill ([http://www.cise.ufl.edu/research/ParallelPatterns/overview.htm link]) considers four levels of patterns: A) FindingConcurrency, B) AlgorithmStructure, C) SupportingStructures, D) ImplementationMechanisms.

A) Under "FindingConcurrency" the designer focuses on high-level issues to expose potential concurrency. There are three major design patterns at this level:
i) Decomposition Strategy – This pattern addresses the question of how to decompose the problem into parts that can run simultaneously. It has the related patterns "Task Decomposition" (decomposing into concurrent tasks) and "Data Decomposition" (decomposing into data units that can be processed independently).
ii) Dependency Analysis – Once the Decomposition Strategy has broken the problem into entities, the dependency analysis patterns help the programmer understand how those entities depend on each other. These dependencies include ordering constraints and data dependencies, covered by the patterns "GroupTasks" (grouping tasks to simplify dependencies), "OrderTasks" (ordering tasks to satisfy constraints), and "DataSharing" (sharing data among tasks).
iii) Design Evaluation – This is a consolidation pattern; it is used to evaluate the results of the other patterns in this design space.
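The Data Decomposition and DataSharing patterns above can be sketched in a few lines. This is a hypothetical illustration, not from the source: the input is split into independent chunks, each chunk is processed concurrently, and the partial results are then combined.

```python
# Sketch of "Data Decomposition": split an array into chunks that can
# be summed independently, then combine the partial sums. The combine
# step is where the "DataSharing" pattern applies.
from concurrent.futures import ThreadPoolExecutor


def parallel_sum(data, chunks=4):
    """Sum data by decomposing it into roughly equal independent chunks."""
    size = max(1, len(data) // chunks)
    parts = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=chunks) as pool:
        # Each chunk is an independent task with no ordering constraints,
        # so the Dependency Analysis step here is trivial.
        return sum(pool.map(sum, parts))
```

For a pure-Python sum the threads add little speedup, but the decomposition structure is the same one a true data-parallel implementation would use.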

B) Algorithm Structure – This step moves closer to the final program and has six patterns in three sub-groups:
i) Organize by flow of data – There are two main patterns in this group: PipelineProcessing – the problem is decomposed into ordered groups of tasks connected by data dependencies; AsynchronousComposition – the problem is decomposed into groups of tasks that interact through asynchronous events.
ii) Organize by tasks – EmbarrassinglyParallel – the problem is decomposed into a set of independent tasks; Divide and Conquer – the problem is broken into subproblems that are solved separately and then recombined.
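The Divide and Conquer pattern in the list above can be sketched with a merge sort; this is an illustrative example, not taken from the source. The two recursive calls are independent subproblems (they could be handed to separate workers), and the merge is the recombination step.

```python
# Sketch of the "Divide and Conquer" pattern: split the problem, solve
# the subproblems independently, then recombine the partial results.

def merge_sort(xs):
    """Sort a list by recursive decomposition into independent halves."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    # The two halves are independent subproblems; in a parallel
    # implementation each could run as a separate task.
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Recombine: merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```

The same split/solve/combine skeleton underlies parallel divide-and-conquer frameworks; only the scheduling of the independent calls changes.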