In a reduce, each data item appearing on the input stream is
split into its basic components, and all the components are
"summed up" using a binary function f that must be associative
and commutative. In the simplest case (the one currently implemented
in ocamlp3l), the reduce skeleton "sums up" all the elements of a
vector using such a function (associativity and commutativity are
to be guaranteed by the user providing the function!).
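The sequential meaning of "summing up" a vector can be sketched in plain OCaml. This is only an illustration of the semantics, not part of the ocamlp3l API; the helper name `sum_vector` is made up here, and `( + )` on integers is used because it is both associative and commutative.

```ocaml
(* Sequential equivalent of what reduce computes on a single vector:
   combine all elements with the user-supplied binary function f.
   f must be associative and commutative; the vector must be non-empty. *)
let sum_vector f v =
  Array.fold_left f v.(0) (Array.sub v 1 (Array.length v - 1))

let () =
  let v = [| 1; 2; 3; 4 |] in
  assert (sum_vector ( + ) v = 10);
  assert (sum_vector max v = 4)
```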
Functionally, reduce f has type
'a vector stream -> 'a stream
provided that the function f has type
'a -> 'a -> 'a.
Given the input stream :xn:...:x1:x0:, the skeleton reduce(f)
computes the output stream
:(fold f xn):...:(fold f x1):(fold f x0):
where we have let rec fold f = function [x] -> x | x::rx -> f x (fold f rx);;
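The fold above can be written out in full, with the empty-list case handled explicitly, and tried on sample data. This is just the reference definition from the text made runnable, not ocamlp3l code:

```ocaml
(* The fold that defines reduce's per-item result:
   combine the elements of a list right-to-left with f. *)
let rec fold f = function
  | [x] -> x
  | x :: rx -> f x (fold f rx)
  | [] -> invalid_arg "fold: empty list"

let () =
  assert (fold ( + ) [1; 2; 3; 4] = 10);   (* 1 + (2 + (3 + 4)) *)
  assert (fold max [3; 7; 2] = 7)
```

Because f is required to be associative and commutative, the right-to-left grouping used here gives the same result as any other grouping, which is exactly what lets the parallel implementation choose a different evaluation order.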
In terms of (parallel) processes, a vector data item appearing on
the input stream of a reduce is processed by a logical tree of
processes, each of which is able to compute the function f. The
resulting process network looks like the following:
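The tree-shaped evaluation performed by that process network can be mimicked sequentially: each internal node of the tree combines the results of its two subtrees with f. The function name `tree_reduce` and the splitting strategy are illustrative assumptions, not the ocamlp3l implementation.

```ocaml
(* Sketch of the logical process tree: split the data in two halves,
   reduce each half (conceptually, in parallel), then combine with f.
   Requires a non-empty list and an associative, commutative f. *)
let rec tree_reduce f = function
  | [] -> invalid_arg "tree_reduce: empty list"
  | [x] -> x
  | xs ->
      let rec split i l =
        if i = 0 then ([], l)
        else match l with
          | [] -> ([], [])
          | h :: t -> let (a, b) = split (i - 1) t in (h :: a, b)
      in
      let (left, right) = split (List.length xs / 2) xs in
      f (tree_reduce f left) (tree_reduce f right)

let () =
  assert (tree_reduce ( + ) [1; 2; 3; 4; 5] = 15);
  assert (tree_reduce max [3; 9; 1; 7] = 9)
```

With n elements the tree has depth on the order of log n, which is why the parallel reduce can beat the linear fold when enough processes are available.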
Have a look at the reduce behaviour in terms of data items processed.