[mpi3-coll] Neighborhood collectives round 2: reductions

Jed Brown jedbrown at mcs.anl.gov
Thu Dec 13 13:27:16 CST 2012


On Tue, Dec 11, 2012 at 12:43 AM, Torsten Hoefler <htor at illinois.edu> wrote:

> On Sun, Dec 09, 2012 at 04:55:51PM -0800, Jed Brown wrote:
> >    Understood. May I ask what is the application motivating this routine?
> For the neighbor_reduce, the use case follows from your earlier emails
> (you asked explicitly for this on the collectives mailing list in
> CAM9tzSmY38bq3gmYqwB5TS5jOv8uwTU9e526tANHDTYgX4XOBg at mail.gmail.com). The
> neighbor_reducev is a simple generalization and gets closer to what you
> were asking for. I think it's non-trivial to provide identity elements
> in the interface, while it's not too hard to just put the data into the
> function (and hopefully without too much overhead).
>

Those use cases (http://lists.mpi-forum.org/mpi3-coll/2011/11/0239.php)
were all dependent on being able to reduce to overlapping targets.

As for defining "identity", the operation I would like is to reduce by
combining incoming data with a local buffer (usually the in-place
destination buffer). That is, if I have the local buffer

mine = [1.1, 2.1, 3.1, 4.1, 5.1, 6.1]

and indexed types for my two neighbors (defined by me), giving the offsets
at which each neighbor's data lands

incoming_type[0] = [0, 3, 4]
incoming_type[1] = [1, 4]

with incoming data (actually getting sent)

incoming_data[0] = [10.2, 20.2, 30.2]
incoming_data[1] = [100.3, 200.3]

the result would be

[op(1.1, 10.2), op(2.1, 100.3), 3.1, op(4.1, 20.2), op(5.1, 30.2, 200.3), 6.1]
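

Concretely, here is a minimal sketch of those semantics in plain C, with
op = sum for concreteness; the helper name combine_incoming and the flat
index arrays are illustrative stand-ins for the datatype description, not
a proposed binding:

  /* Sketch only: apply op (sum here) between the local buffer and each
   * neighbor's incoming data at the offsets that neighbor targets. */
  #include <stdio.h>

  static void combine_incoming(double *mine, const int *idx, int n,
                               const double *data)
  {
    for (int i = 0; i < n; i++)
      mine[idx[i]] += data[i]; /* op(mine[idx[i]], data[i]), op = sum */
  }

  int main(void)
  {
    double mine[6]  = {1.1, 2.1, 3.1, 4.1, 5.1, 6.1};
    int    idx0[3]  = {0, 3, 4};            /* incoming_type[0] */
    int    idx1[2]  = {1, 4};               /* incoming_type[1] */
    double data0[3] = {10.2, 20.2, 30.2};   /* incoming_data[0] */
    double data1[2] = {100.3, 200.3};       /* incoming_data[1] */

    combine_incoming(mine, idx0, 3, data0); /* neighbor 0 */
    combine_incoming(mine, idx1, 2, data1); /* neighbor 1, overlaps at 4 */

    for (int i = 0; i < 6; i++) printf("%g ", mine[i]);
    printf("\n"); /* 11.3 102.4 3.1 24.3 235.6 6.1 */
    return 0;
  }

Note that entry 4 receives contributions from both neighbors; that is
exactly the overlapping-target case at issue here.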


This would be a natural expression of the operation I call "SFReduce" in
http://59A2.org/files/StarForest.pdf
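
The per-neighbor layouts above are already expressible with standard
indexed datatypes; for illustration (the wrapper function is hypothetical,
but the MPI calls are standard):

  /* Build the two incoming_type descriptions from the example using
   * standard MPI datatype constructors. */
  #include <mpi.h>

  static void build_incoming_types(MPI_Datatype incoming_type[2])
  {
    int disp0[3] = {0, 3, 4}; /* neighbor 0 targets these entries */
    int disp1[2] = {1, 4};    /* neighbor 1; overlaps neighbor 0 at 4 */

    MPI_Type_create_indexed_block(3, 1, disp0, MPI_DOUBLE, &incoming_type[0]);
    MPI_Type_create_indexed_block(2, 1, disp1, MPI_DOUBLE, &incoming_type[1]);
    MPI_Type_commit(&incoming_type[0]);
    MPI_Type_commit(&incoming_type[1]);
  }

What is missing is a reduction routine that combines into (rather than
overwrites) the targets such types describe on the receive side.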


> If you now think there is no use case for either (or both) of those
> functions, then we should discontinue the discussion of them (I have
> absolutely no problem with this :-).


I'm not optimistic about their utility without support for reduction into
different/overlapping targets.