[Mpi3-rma] nested locks

Jim Dinan james.dinan at gmail.com
Mon Aug 19 15:58:31 CDT 2013


Hi Rajeev,

Re-pasting:

---8<---
pg 437, ln 28: "Distinct access epochs for win at the same process must be
disjoint."
---8<---

I think it's still ok (sorry Jeff, I think this is the same thing you were
just saying).  That sentence states that one cannot have more than one
access epoch open to the same target concurrently.  I don't think it means
that one cannot have concurrent epochs to distinct targets.

 ~Jim.


On Mon, Aug 19, 2013 at 4:49 PM, Rajeev Thakur <thakur at mcs.anl.gov> wrote:

> Those are two access epochs on the same rank (say rank 0), as defined by
> the paragraph above that sentence (pg 437, ln 28).
>
> On Aug 19, 2013, at 2:52 PM, Jeff Hammond wrote:
>
> > you mean lock at those ranks, not by those ranks?  those aren't
> conflicting so i don't see the issue.  maybe i am a dolt though.
> >
> > jeff
> >
> >
> > On Mon, Aug 19, 2013 at 2:44 PM, Rajeev Thakur <thakur at mcs.anl.gov>
> wrote:
> > Did we make any change in MPI 3.0 to allow nesting of MPI_Win_lock
> calls? For example,
> >
> > Win_lock(rank 1)
> >     Win_lock(rank 2)
> >
> >     Win_unlock(rank 2)
> > Win_unlock(rank 1)
> >
> > I can't find the text we added or deleted to allow this.
> >
> > But I do see text that disallows this:
> > pg 437, ln 28: "Distinct access epochs for win at the same process must
> be disjoint."
> >
> > Rajeev
> >
> >
> > _______________________________________________
> > mpi3-rma mailing list
> > mpi3-rma at lists.mpi-forum.org
> > http://lists.mpi-forum.org/mailman/listinfo.cgi/mpi3-rma
> >
> >
> >
> > --
> > Jeff Hammond
> > jeff.science at gmail.com

