<div dir="ltr">This ticket is now <a href="https://github.com/mpi-forum/mpi-issues/issues/31">https://github.com/mpi-forum/mpi-issues/issues/31</a>.</div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Jan 11, 2016 at 6:33 PM, Jim Dinan <span dir="ltr"><<a href="mailto:james.dinan@gmail.com" target="_blank">james.dinan@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Do we also want to pick back up the address space query routine?  I think the latest proposal we looked at is MPI_COMM_TYPE_ADDRESS_SPACE (<a href="https://svn.mpi-forum.org/trac/mpi-forum-web/ticket/485" target="_blank">https://svn.mpi-forum.org/trac/mpi-forum-web/ticket/485</a>).</div><div class="HOEnZb"><div class="h5"><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Jan 11, 2016 at 12:26 PM, Balaji, Pavan <span dir="ltr"><<a href="mailto:balaji@anl.gov" target="_blank">balaji@anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Steve, Dan,<br>
<br>
Here's the change that we discussed at the last meeting.<br>
<br>
In section 2.7 first paragraph, change the last sentence:<br>
<br>
from<br>
<br>
"Typically, each process executes in its own address space, although shared-memory implementations of MPI are possible."<br>
<br>
to<br>
<br>
"Typically, each MPI process executes in its own address space, although it is possible for multiple MPI processes to execute within a shared address space."<br>
<br>
  -- Pavan<br>
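For concreteness, here is a minimal C sketch of how the two items above relate, assuming the address-space query Jim mentions is adopted as a new split_type value for MPI_Comm_split_type, as proposed in ticket 485. MPI_COMM_TYPE_SHARED is the existing MPI-3 constant; MPI_COMM_TYPE_ADDRESS_SPACE is only the proposed name (its final form and semantics could change), so that part is guarded and the sketch still compiles against current implementations.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    /* Existing MPI-3 query: the MPI processes on which shared memory
     * windows can be created together. */
    MPI_Comm shm_comm;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &shm_comm);

    int shm_size;
    MPI_Comm_size(shm_comm, &shm_size);
    printf("rank %d: %d MPI process(es) can share memory with me\n",
           world_rank, shm_size);

    /* Proposed query (ticket 485, not in the standard): the MPI processes
     * that execute within the same address space.  Compiled only if an
     * implementation happens to provide the proposed constant as a macro. */
#ifdef MPI_COMM_TYPE_ADDRESS_SPACE
    MPI_Comm as_comm;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_ADDRESS_SPACE, 0,
                        MPI_INFO_NULL, &as_comm);

    int as_size;
    MPI_Comm_size(as_comm, &as_size);
    printf("rank %d: %d MPI process(es) share my address space\n",
           world_rank, as_size);
    MPI_Comm_free(&as_comm);
#endif

    MPI_Comm_free(&shm_comm);
    MPI_Finalize();
    return 0;
}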
> On Jan 11, 2016, at 10:59 AM, Daniel Holmes <dholmes@epcc.ed.ac.uk> wrote:
>
> Hi,
>
> The "Click this link to join the webex at the meeting time" instruction on the new GitHub wiki is plain text.
>
> The link address (copied from the old Trac wiki) is:
> https://cisco.webex.com/ciscosales/j.php?ED=236535652&UID=0&PW=NOGE0NDk5MmVh&RT=MiMxMQ%3D%3D
>
> Cheers,
> Dan.
>
> On 11/01/2016 16:43, Balaji, Pavan wrote:
>> I thought we did, but Jim's email confused me now.  I'll call in anyway and see if others join in.
>>
>>  -- Pavan
>>
>> Sent from my iPhone
>>
>>> On Jan 11, 2016, at 10:38 AM, Steven Oyanagi <sko@cray.com> wrote:
>>>
>>> Hi,
>>>
>>> Are we having a meeting today?  I don’t think I have seen an e-mail since
>>> the December face-to-face meeting.
>>>
>>> If we do meet, I would like to discuss some suggestions Ryan Grant from
>>> Sandia had about how to get the possible “change ‘process’ to something
>>> else everywhere in the standard” modification through the Forum.
>>>    - Steve
>>>
>>> -----Original Message-----
>>> From: mpiwg-hybridpm <mpiwg-hybridpm-bounces@lists.mpi-forum.org> on behalf of Jim Dinan <james.dinan@gmail.com>
>>> Reply-To: "mpiwg-hybridpm@lists.mpi-forum.org" <mpiwg-hybridpm@lists.mpi-forum.org>
>>> Date: Monday, November 2, 2015 at 10:52 AM
>>> To: "mpiwg-hybridpm@lists.mpi-forum.org" <mpiwg-hybridpm@lists.mpi-forum.org>
>>> Subject: [mpiwg-hybridpm] Hybrid WG Hiatus Until January
>>>
>>>> Hi All,
>>>>
>>>> The hybrid WG members are planning to suspend meetings until January to
>>>> focus on forward progress for other MPI Forum activities.
>>>>
>>>> Look for email following the December meeting with details.
>>>>
>>>> ~Jim.
>>>>
>
> --
> Dan Holmes
> Applications Consultant in HPC Research
> EPCC, The University of Edinburgh
> James Clerk Maxwell Building
> The Kings Buildings
> Peter Guthrie Tait Road
> Edinburgh
> EH9 3FD
> T: +44(0)131 651 3465
> E: dholmes@epcc.ed.ac.uk
>
_______________________________________________
mpiwg-hybridpm mailing list
mpiwg-hybridpm@lists.mpi-forum.org
http://lists.mpi-forum.org/mailman/listinfo.cgi/mpiwg-hybridpm

--
Jeff Hammond
jeff.science@gmail.com
http://jeffhammond.github.io/