[mpiwg-tools] Intel MPI Backend Breakpoint

rhc at open-mpi.org
Fri Jul 14 01:14:02 CDT 2017


We will deprecate MPIR in v3.1 (expected out this fall) and may phase it out sometime in 2018 with the release of OMPI 4.0, or perhaps as late as 2019. No firm schedule has been set yet; we are just trying to give folks like you as much notice as possible. You should plan on having at least one year to get ready.

> On Jul 13, 2017, at 9:03 AM, John DelSignore <John.DelSignore at roguewave.com> wrote:
> 
> Ouch. Have you decided what the deprecation timeline looks like yet? In other words, when do you think Open MPI will stop supporting MPIR?
> Cheers, John D.
> 
> On 07/13/17 08:00, Jeff Squyres (jsquyres) wrote:
>> FWIW, we just decided this week in Open MPI to deprecate the MPIR interface in favor of PMIx.
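>> 
>> (For the curious: with PMIx, a tool connects to the launcher's PMIx server and queries job information directly, instead of reading MPIR symbols out of the launcher's address space. A minimal sketch, assuming the tool API from pmix_tool.h in the PMIx reference implementation:
>> 
>>     #include <pmix_tool.h>
>> 
>>     int main(void) {
>>         pmix_proc_t me;
>>         /* Connect this tool to a locally reachable PMIx server,
>>            e.g. the mpirun that launched the job. */
>>         if (PMIx_tool_init(&me, NULL, 0) != PMIX_SUCCESS)
>>             return 1;
>>         /* Job state and the process table are then available via
>>            the PMIx query interface (PMIx_Query_info_nb) rather
>>            than via MPIR_proctable. */
>>         PMIx_tool_finalize();
>>         return 0;
>>     }
>> 
>> )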
>> 
>> 
>>> On Jul 12, 2017, at 2:02 PM, Durnov, Dmitry <dmitry.durnov at intel.com> wrote:
>>> 
>>> Sure.
>>> 
>>> Thanks.
>>> 
>>> BR,
>>> Dmitry
>>> 
>>> -----Original Message-----
>>> From: mpiwg-tools [mailto:mpiwg-tools-bounces at lists.mpi-forum.org] On Behalf Of John DelSignore
>>> Sent: Wednesday, July 12, 2017 9:52 PM
>>> To: mpiwg-tools at lists.mpi-forum.org
>>> Subject: Re: [mpiwg-tools] Intel MPI Backend Breakpoint
>>> 
>>> I'd be interested in being included in that discussion. FWIW, I work on the TotalView debugger and wrote up the MPIR specification.
>>> 
>>> Cheers, John D.
>>> 
>>> -----Original Message-----
>>> From: mpiwg-tools [mailto:mpiwg-tools-bounces at lists.mpi-forum.org] On Behalf Of Durnov, Dmitry
>>> Sent: Wednesday, July 12, 2017 2:44 PM
>>> To: mpiwg-tools at lists.mpi-forum.org
>>> Subject: Re: [mpiwg-tools] Intel MPI Backend Breakpoint
>>> 
>>> Hi Alex,
>>> 
>>> I've started a separate mail thread where we can discuss the details.
>>> 
>>> Thanks.
>>> 
>>> BR,
>>> Dmitry
>>> 
>>> -----Original Message-----
>>> From: mpiwg-tools [mailto:mpiwg-tools-bounces at lists.mpi-forum.org] On Behalf Of Alexander Zahdeh
>>> Sent: Wednesday, July 12, 2017 7:27 PM
>>> To: mpiwg-tools at lists.mpi-forum.org
>>> Subject: [mpiwg-tools] Intel MPI Backend Breakpoint
>>> 
>>> Hi,
>>> 
>>> This is Alex Zahdeh, one of the debugger tools developers at Cray. I have a question about how Intel MPI handles synchronization under the MPIR debugging standard. Our debugger's usual procedure is to launch tool daemons that attach to the backend application processes while the application launcher is held at MPIR_Breakpoint. At that point the application processes must be blocked in some sort of barrier, so the debugger tries to return the user to their own code: it sets breakpoints at various initialization symbols for the different parallel models, continues, hits one of the breakpoints, deletes the rest, and finishes the current function. This works only if the application is held before the breakpoints we set, which does not seem to be the case with Intel MPI. Is there a more standard approach to returning the user to their own code, or does it vary by programming model and implementor? And specifically with Intel MPI, is there a good breakpoint to set in this scenario?
>>> Thanks much,
>>> Alex
>>> --
>>> Alex Zahdeh | PE Debugger Development | Cray Inc.
>>> azahdeh at cray.com | Office: 651-967-9628 | Cell: 651-300-2005
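>>> 
>>> (Added for context: the launcher-side interface Alex describes consists of well-known symbols that the debugger reads out of the starter process. A minimal sketch, paraphrased from the MPIR Process Acquisition Interface document rather than copied verbatim:
>>> 
>>>     /* One entry per MPI process; the launcher fills this table in
>>>        before calling MPIR_Breakpoint(). */
>>>     typedef struct {
>>>         char *host_name;        /* host the process is running on */
>>>         char *executable_name;  /* path to the executable image */
>>>         int   pid;              /* pid the tool daemon attaches to */
>>>     } MPIR_PROCDESC;
>>> 
>>>     MPIR_PROCDESC *MPIR_proctable;  /* read by the debugger */
>>>     int MPIR_proctable_size;
>>>     volatile int MPIR_debug_state;  /* e.g. MPIR_DEBUG_SPAWNED */
>>>     int MPIR_being_debugged;        /* set nonzero by the debugger */
>>> 
>>>     /* The launcher calls this once the proctable is valid; the
>>>        debugger plants its breakpoint here to hold the launcher. */
>>>     void MPIR_Breakpoint(void) { /* intentionally empty */ }
>>> 
>>> MPIR leaves it largely implementation-defined where the application processes themselves are blocked while the launcher sits in MPIR_Breakpoint, which is exactly the gap Alex's question is about.)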
>> 
> 
> _______________________________________________
> mpiwg-tools mailing list
> mpiwg-tools at lists.mpi-forum.org
> https://lists.mpi-forum.org/mailman/listinfo/mpiwg-tools
