[Mpi3-hybridpm] [MPI Forum] #208: MPI3 Hybrid Programming: multiple endpoints per collective, option 1

MPI Forum mpi-22 at lists.mpi-forum.org
Thu Jan 7 07:20:39 CST 2010

#208: MPI3 Hybrid Programming: multiple endpoints per collective, option 1
                    Reporter:  dougmill@…                 |                       Owner:  dougmill@…             
                        Type:  Enhancements to standard   |                      Status:  new                    
                    Priority:  Not ready / author rework  |                   Milestone:  2010/01/19 Atlanta, USA
                     Version:  MPI 3.0                    |                    Keywords:                         
              Implementation:  Unnecessary                |           Author_bill_gropp:  0                      
          Author_rich_graham:  0                          |           Author_adam_moody:  0                      
      Author_torsten_hoefler:  0                          |        Author_dick_treumann:  0                      
Author_jesper_larsson_traeff:  0                          |       Author_george_bosilca:  0                      
           Author_david_solt:  0                          |   Author_bronis_de_supinski:  0                      
        Author_rajeev_thakur:  0                          |         Author_jeff_squyres:  0                      
    Author_alexander_supalov:  0                          |    Author_rolf_rabenseifner:  0                      

Comment(by dougmill@…):

 If the compute/communicate pattern were not (otherwise) synchronized
 between the threads, then the program could not achieve horizontal
 parallelism this way. If each agent were truly independent, and the
 communication/computation in one agent did not depend on that of other
 agents, then the only way to get horizontal parallelism would be to add
 more threads to each endpoint. It is not clear whether multiple methods
 for horizontal parallelism should be offered - e.g., could a program use
 either join/leave or multiple-attach (or even both at different places in
 the same program)?
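
 For concreteness, a minimal sketch of the closest approximation available
 with standardized MPI calls today: since the join/leave and multiple-attach
 operations under discussion are not part of the standard, one duplicated
 communicator per thread stands in for one endpoint per thread, so that each
 agent can drive its own collective independently (assumes the implementation
 provides MPI_THREAD_MULTIPLE).

 /*
  * Sketch only: join/leave and multiple-attach are not standardized, so
  * one MPI_Comm_dup per thread approximates "one endpoint per thread".
  */
 #include <mpi.h>
 #include <pthread.h>
 #include <stdio.h>

 #define NTHREADS 2

 static MPI_Comm thread_comm[NTHREADS]; /* one "endpoint" stand-in per thread */

 static void *agent(void *arg)
 {
     int tid = (int)(size_t)arg;
     int local = tid, sum = 0;

     /* Each agent runs its own collective on its own communicator; the
        compute/communicate pattern needs no cross-thread synchronization,
        which is the "truly independent" case described above. */
     MPI_Allreduce(&local, &sum, 1, MPI_INT, MPI_SUM, thread_comm[tid]);
     printf("thread %d: sum = %d\n", tid, sum);
     return NULL;
 }

 int main(int argc, char **argv)
 {
     int provided, i;
     pthread_t t[NTHREADS];

     MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
     if (provided < MPI_THREAD_MULTIPLE)
         MPI_Abort(MPI_COMM_WORLD, 1);

     /* Communicator duplication is collective, so do it on the main
        thread before spawning the agents. */
     for (i = 0; i < NTHREADS; i++)
         MPI_Comm_dup(MPI_COMM_WORLD, &thread_comm[i]);

     for (i = 0; i < NTHREADS; i++)
         pthread_create(&t[i], NULL, agent, (void *)(size_t)i);
     for (i = 0; i < NTHREADS; i++)
         pthread_join(t[i], NULL);

     for (i = 0; i < NTHREADS; i++)
         MPI_Comm_free(&thread_comm[i]);
     MPI_Finalize();
     return 0;
 }

 Under either proposed mechanism, the MPI_Comm_dup calls would be replaced
 by attaching (or joining) the threads to endpoints of a single
 communicator.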

Ticket URL: <https://svn.mpi-forum.org/trac/mpi-forum-web/ticket/208#comment:5>
MPI Forum <https://svn.mpi-forum.org/>
