[mpiwg-hybridpm] First draft of Continuations section
Joseph Schuchart
schuchart at icl.utk.edu
Wed Aug 4 13:02:49 CDT 2021
Thanks Jeff! I am less worried about the out-of-band update of the
status objects than about the whole business of callbacks and function
pointers. The only place in the MPI standard that uses callbacks right
now is the tools chapter, and that is explicitly C-only. Looking at
https://scicomp.stackexchange.com/a/286 it seems possible to pass
function pointers around in Fortran, but the comments suggest that this
is not compatible with C?
Thanks
Joseph
On 8/4/21 1:00 PM, Jeff Hammond wrote:
> Regarding Fortran…
>
> In theory, you need ASYNCHRONOUS/VOLATILE attributes on visible state that’s updated out of band.
>
> In practice, MPI implementations have done this with nonblocking send-recv for 25 years and it hasn’t been an issue.
>
> Sent from my iPhone
>
>> On Aug 4, 2021, at 7:36 PM, Joseph Schuchart via mpiwg-hybridpm <mpiwg-hybridpm at lists.mpi-forum.org> wrote:
>>
>> Dear all,
>>
>> I have written and attached a first draft of the continuations section that I would like to discuss in one of the upcoming meetings. Right now it is embedded at the end of Section 3 (Point-to-Point, following test and wait, etc.), where I felt it would fit well; I'm open to other suggestions, though.
>>
>> There are still some open TODOs that I'd like to discuss. In particular, I'm grappling with the handling of statuses. As suggested during previous discussions, I removed the flag argument to MPI_Continue so that the MPI implementation is allowed to invoke the continuations directly (this can be controlled through an info key, though). That means that after the call to MPI_Continue there is no good reason to inspect the statuses anymore. However, we may still want the user to provide the status object(s) in order to avoid exposing implementation-internal memory in the callback. But then I am worried that this becomes a source of errors, with users providing pointers to stack memory that goes out of scope before the continuation is invoked.
>>
>> Also, I am unsure whether and how this whole mechanism works with Fortran. The Python generator happily generates the interface, but I have no idea whether that is correct...
>>
>> Looking forward to discussing this on the call!
>>
>> Cheers
>> Joseph
>> <mpi40-report-continuations.pdf>
>> _______________________________________________
>> mpiwg-hybridpm mailing list
>> mpiwg-hybridpm at lists.mpi-forum.org
>> https://lists.mpi-forum.org/mailman/listinfo/mpiwg-hybridpm