parallelism

1. parallel processing.

2. The maximum number of independent subtasks in a given task at a given point in its execution. E.g. in computing the expression (a + b) * (c + d), the expressions a, b, c and d can all be calculated in parallel, giving a degree of parallelism of (at least) four. Once they have been evaluated, the expressions a + b and c + d can be calculated as two independent parallel processes.
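For illustration, here is a minimal sketch of the two levels of parallelism in the example above, written in Python with a thread pool. The functions compute_a to compute_d are hypothetical stand-ins for whatever work produces the four operands; they are not part of the original example.

    # Sketch of the two-level parallelism in the example above.
    # compute_a .. compute_d are hypothetical stand-ins for whatever
    # work produces the four operands.
    from concurrent.futures import ThreadPoolExecutor

    def compute_a(): return 2
    def compute_b(): return 3
    def compute_c(): return 4
    def compute_d(): return 5

    with ThreadPoolExecutor() as pool:
        # Degree of parallelism four: a, b, c and d are independent.
        futures = [pool.submit(f) for f in
                   (compute_a, compute_b, compute_c, compute_d)]
        a, b, c, d = (f.result() for f in futures)

        # Degree of parallelism two: a + b and c + d are independent.
        left = pool.submit(lambda: a + b)
        right = pool.submit(lambda: c + d)

        print(left.result() * right.result())   # (a + b) * (c + d) = 45

Threads are used here only to make the dependence structure explicit; the same structure applies to any parallel execution mechanism.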

The Bernstein condition states that processes P and Q can be executed in parallel (or in either sequential order) only if:

(i) there is no overlap between the inputs of P and the outputs of Q, nor between the inputs of Q and the outputs of P, and

(ii) there is no overlap between the outputs of P, the outputs of Q and the inputs of any other task.

If process P outputs a value v which process Q reads, then P must be executed before Q. If both processes write to some variable, its final value will depend on their execution order, so they cannot be executed in parallel if any other process depends on that variable's value.
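As a rough illustration of this rule, the sketch below checks the two conditions for a pair of processes, under the assumption (made only for this example) that a process can be described by the sets of variables it reads and writes; the function name is invented here.

    # Sketch: the Bernstein condition as a check on read/write sets.
    # Representing a process as (reads, writes) sets is an assumption
    # made for illustration only.

    def can_run_in_parallel(p_reads, p_writes, q_reads, q_writes,
                            other_reads=frozenset()):
        # (i) the inputs of P must not overlap the outputs of Q,
        #     and vice versa.
        if p_reads & q_writes or q_reads & p_writes:
            return False
        # (ii) variables written by both P and Q must not be read
        #      by any other task.
        if p_writes & q_writes & other_reads:
            return False
        return True

    # P writes v and Q reads v: P must run before Q.
    print(can_run_in_parallel({"x"}, {"v"}, {"v"}, {"y"}))        # False

    # Both write v but nothing else reads it: order does not matter.
    print(can_run_in_parallel({"a"}, {"v"}, {"b"}, {"v"}))        # True

    # Both write v and another task reads it: not parallelisable.
    print(can_run_in_parallel({"a"}, {"v"}, {"b"}, {"v"}, {"v"})) # False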

Last updated: 1995-05-07
