[Mpi3-tools] MPI3 tools WG Mission Statement

Marc-Andre Hermanns m.a.hermanns at fz-juelich.de
Wed Sep 2 13:46:30 CDT 2009


Hi Martin,

> The goal of this working group is define interfaces that can be used 
> by both performance and correctness tools to gain information about
> internals of the MPI library, interactions between an application and
> the MPI library, as well as the system environment an MPI application
> runs in. This is intended as an extension to the existing and widely
> used PMPI interface. These efforts will help in providing reliable
> and portable interfaces for MPI tools with new functionality
> currently not covered by the MPI standard.

Does 'correctness tools' include debuggers? It immediately brought to
mind tools like Marmot rather than TotalView, DDT, etc., as I would not
usually refer to a debugger as a correctness checker (but then, I am not
a native speaker).

To make that a little clearer, I would propose the following change:

The goal of this working group is to define interfaces that can be used
by third party tools to obtain internal information about the MPI
library, as well as the system environment an MPI application runs in.
These tools include performance analysis tools, correctness checkers and
debuggers. These interfaces are complementary to the existing PMPI
interface. These efforts will help in providing reliable and portable
interfaces for MPI tools with new functionality currently not covered by
the MPI standard.
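As background on the PMPI interface mentioned above: it is a name-shifted profiling interface, where a tool intercepts an MPI call by providing its own definition of the public name and forwarding to the PMPI_ entry point. A minimal sketch of such a wrapper (the counter and the printed message are mine, just for illustration; this would be compiled into a library linked between the application and the MPI library):

```c
#include <mpi.h>
#include <stdio.h>

/* Hypothetical tool-side counter, for illustration only. */
static int send_count = 0;

/* The tool defines its own MPI_Send; the MPI library's real
 * implementation remains reachable under the name PMPI_Send. */
int MPI_Send(void *buf, int count, MPI_Datatype datatype,
             int dest, int tag, MPI_Comm comm)
{
    send_count++;  /* tool-specific bookkeeping */
    return PMPI_Send(buf, count, datatype, dest, tag, comm);
}

/* Intercept MPI_Finalize to report what was gathered. */
int MPI_Finalize(void)
{
    printf("intercepted %d calls to MPI_Send\n", send_count);
    return PMPI_Finalize();
}
```

Such a wrapper sees every MPI_Send without recompiling the application, but only at the call boundary; the interfaces listed below would complement this by exposing information PMPI cannot reach, such as message queues or the library's internal state.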

> In particular, this WG is aiming at the standardization of the
> following interfaces:
> - A scalable version of a process acquisition interface to identify
>   and locate all processes which are part of an MPI job.
> - An interface to look for and locate tool DLLs.
> - An interface to inspect message queues.
> - An interface to understand progress in collective communication.
> - An interface to query additional semantic information for opaque
>   MPI handles.
> - A performance information interface that allows MPI implementations
>   to export additional low-level performance information including,
>   but not limited to, the operating state of the MPI library.

The time frame that was discussed today for the MPI 3.x effort was 3+1
years: three years until 3.0 is ready, and an additional year for
something like 3.1 (Jeff, feel free to correct me on this). I don't yet
have the experience to estimate whether this is doable in that time
frame, but the list seems quite long. All the items on the list seem to
be valid points of interest, though.

Best regards,
Marc-Andre
-- 
Marc-Andre Hermanns
Juelich Supercomputing Centre
Institute for Advanced Simulation
Forschungszentrum Juelich GmbH
D-52425 Juelich
Germany

Phone : +49-2461-61-2054
Fax   : +49-2461-61-6656
eMail : m.a.hermanns at fz-juelich.de
WWW   : http://www.fz-juelich.de/jsc/

JSC is the coordinator of the
John von Neumann Institute for Computing
and member of the
Gauss Centre for Supercomputing

Registered office: Juelich
Registered in the commercial register of the district court of Dueren, No. HR B 3498
Chair of the Supervisory Board: MinDir'in Baerbel Brumme-Bothe
Executive Board: Prof. Dr. Achim Bachem (Chairman),
                 Dr. Ulrich Krafft (Deputy Chairman),
                 Prof. Dr. Harald Bolt,
                 Prof. Dr. Sebastian M. Schmidt