<html><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; ">
<div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 37px; text-indent: -37px; "><font face="Helvetica" size="3" color="#000000" style="font: 12.0px Helvetica; color: #000000"><b>From: </b></font><font face="Helvetica" size="3" style="font: 12.0px Helvetica">Richard Treumann <<a href="mailto:treumann@us.ibm.com">treumann@us.ibm.com</a>></font></div><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 34px; text-indent: -34px; "><font face="Helvetica" size="3" color="#000000" style="font: 12.0px Helvetica; color: #000000"><b>Date: </b></font><font face="Helvetica" size="3" style="font: 12.0px Helvetica">January 29, 2008 10:38:02 AM CST</font></div><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 21px; text-indent: -21px; "><font face="Helvetica" size="3" color="#000000" style="font: 12.0px Helvetica; color: #000000"><b>To: </b></font><font face="Helvetica" size="3" style="font: 12.0px Helvetica">"Mailing list for discussion of MPI 2.1" <<a href="mailto:mpi-21@cs.uiuc.edu">mpi-21@cs.uiuc.edu</a>></font></div><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 50px; text-indent: -50px; "><font face="Helvetica" size="3" color="#000000" style="font: 12.0px Helvetica; color: #000000"><b>Subject: </b></font><font face="Helvetica" size="3" style="font: 12.0px Helvetica"><b>Re: [mpi-21] MPI_FINALIZE</b></font></div><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 0px; min-height: 14px; "><br></div><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 0px; min-height: 14px; "><br></div><p>I do not see a need for clarification.<br> <br> As long as somewhere in the standard the following points are clear:<br> 1) Only MPI_BARRIER promises barrier behavior<br> 2) Any collective may be implemented with barrier behavior as a side effect<br> <br> (performance issues make some collectives (e.g., 
MPI_Bcast) unlikely to behave like a barrier, but the standard does not rule it out)<br> <br> Dick <br> <br> <br> <br> Dick Treumann - MPI Team/TCEM <br> IBM Systems &amp; Technology Group<br> Dept 0lva / MS P963 -- 2455 South Road -- Poughkeepsie, NY 12601<br> Tele (845) 433-7846 Fax (845) 433-8363<br> <br> <br><tt>> Mainly to Jeff Squyres, Hubert Ritzdorf, Rolf Rabenseifner, Nicholas Nevin,<br> > Bill Gropp, Dick Treumann <br> > <br> > This is a follow-up to:<br> > MPI_FINALIZE in MPI-2 (with spawn) <br> > in </tt><tt><a href="http://www.cs.uiuc.edu/homes/wgropp/projects/parallel/MPI/mpi-errata/index.html">http://www.cs.uiuc.edu/homes/wgropp/projects/parallel/MPI/mpi-errata/index.html</a></tt><tt><br> > with mail discussion in<br> > </tt><tt><a href="http://www.cs.uiuc.edu/homes/wgropp/projects/parallel/MPI/mpi-errata/discuss/finalize/">http://www.cs.uiuc.edu/homes/wgropp/projects/parallel/MPI/mpi-errata/discuss/finalize/</a></tt><tt><br> > _________________________________________<br> > <br> > If I understand correctly, a clarification is not needed<br> > because the MPI standard already says everything needed, i.e.<br> > - MPI_Finalize need not behave like a barrier, <br> > - but it is allowed to include a barrier.<br> > - If the user wants to exit one spawned process while<br> > the others continue to work, he/she must<br> > disconnect that process before calling MPI_Finalize on it.<br> > <br> > If somebody wants a clarification to be included in the standard,<br> > and therefore in Ballot 4, please send me your wording<br> > with the page and line references included.<br> > <br> > If all agree that no clarification is needed, then I will close <br> > this track.<br> > <br> > Best regards<br> > Rolf<br> > <br> > <br> > Dr. Rolf Rabenseifner . . . . . . . . . .. email <a href="mailto:rabenseifner@hlrs.de">rabenseifner@hlrs.de</a><br> > High Performance Computing Center (HLRS) . phone ++49(0)711/685-65530<br> > University of Stuttgart . . . . . . . . .. 
fax ++49(0)711 / 685-65832<br> > Head of Dpmt Parallel Computing . . . <a href="http://www.hlrs.de/people/rabenseifner">www.hlrs.de/people/rabenseifner</a><br> > Nobelstr. 19, D-70550 Stuttgart, Germany . (Office: Allmandring 30)<br> > _______________________________________________<br> > mpi-21 mailing list<br> > <a href="mailto:mpi-21@cs.uiuc.edu">mpi-21@cs.uiuc.edu</a><br> > </tt><tt><a href="http://lists.cs.uiuc.edu/mailman/listinfo/mpi-21">http://lists.cs.uiuc.edu/mailman/listinfo/mpi-21</a></tt><tt><br> </tt></p></body></html>