[mpiwg-tools] Vector MPI_T reads

Marc-Andre Hermanns m.a.hermanns at grs-sim.de
Thu Dec 5 09:07:58 CST 2013


Martin,

will there be an opportunity to join via WebEx or Skype? I cannot
promise to stay for the full four hours, but for some topics it would be
great to dial in, as I did at one of the recent meetings.

Cheers,
Marc-Andre

On 04.12.13 16:53, Schulz, Martin wrote:
> I agree - interesting, and I am looking forward to hearing more. I also agree with Kathryn - some of those don't sound like MPI_T data, but this could also argue for an integration through PAPI, so that tools can grab all data from one source.
> 
> We are getting quite a list to talk about (I'll also have some items ready for PMPI2, and I think we should take a look at Craig's wrapper generator) - I know Jeff is also busy in the hybrid WG, but we could hold some of the discussions on Tuesday, in parallel with RMA, if we need/want the time. Fab has two rooms for us all day on Tuesday as well.
> 
> Martin
> 
> 
> On Dec 4, 2013, at 7:36 AM, Kathryn Mohror <kathryn at llnl.gov>
>  wrote:
> 
>>
>>
>>> FWIW, we're getting some interesting asks from users for MPI_T pvars:
>>>
>>> - underlying network data
>>>  - But then how do you correlate it to MPI activity?  Not as easy as you would think.
>>> - other hardware / environment data
>>>  - CPU temperature, fan speed, ...etc. (think: IPMI-like data)
>>>  - Should these be reported once per server, or in every MPI process?
>>> - stats that have traditionally been exposed via PMPI
>>>  - E.g., number of times MPI_Send was invoked
>>> - other MPI layer data
>>>  - E.g., average depth of unexpected queue
>>>
>>> Of this list, I think "other MPI layer data" is the only type of data I expected to expose via MPI_T (i.e., it's kinda what we designed MPI_T for).  The others are all use cases that I was not previously expecting.  Interesting.
>>
>> Some of these are outside the scope of what I would have expected MPI_T to provide and could instead come from other libraries. I, too, imagined it would be information specific to the MPI library, or whatever it was already collecting. But you are right, interesting!
>>
>> Kathryn
>>
>>>
>>>
>>>
>>> On Dec 4, 2013, at 10:20 AM, Kathryn Mohror <kathryn at llnl.gov> wrote:
>>>
>>>> Yes. I'll put it on the agenda.
>>>>
>>>> Kathryn
>>>>
>>>> On Dec 4, 2013, at 3:27 AM, Jeff Squyres (jsquyres) <jsquyres at cisco.com> wrote:
>>>>
>>>>> Can we add this to the agenda to discuss next week?
>>>>>
>>>>>
>>>>> On Dec 4, 2013, at 1:05 AM, "Schulz, Martin" <schulzm at llnl.gov> wrote:
>>>>>
>>>>>> I think this could be interesting and helpful, but wouldn't this be expensive to implement, at least for some variables? Would we need some way to indicate which variables can be read atomically?
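
For reference, MPI 3.0's MPI_T_pvar_get_info already reports a per-variable "atomic" flag, which describes whether the variable supports atomic read-and-reset via MPI_T_pvar_readreset. A minimal sketch of querying that existing flag follows, assuming the tool information interface has already been initialized with MPI_T_init_thread; a further per-variable flag advertising atomic *vector* readability, along the lines of Martin's question, would be a new, hypothetical addition.

#include <mpi.h>

/* Return the "atomic" flag that MPI_T_pvar_get_info reports for a pvar
 * index (in MPI 3.0 this flag refers to atomic read-and-reset support).
 * Assumes MPI_T_init_thread() has already been called. */
static int pvar_is_atomic(int pvar_index)
{
    char name[256], desc[1024];
    int name_len = sizeof(name), desc_len = sizeof(desc);
    int verbosity, var_class, bind, readonly, continuous, atomic;
    MPI_Datatype datatype;
    MPI_T_enum enumtype;

    if (MPI_T_pvar_get_info(pvar_index, name, &name_len, &verbosity,
                            &var_class, &datatype, &enumtype,
                            desc, &desc_len, &bind, &readonly,
                            &continuous, &atomic) != MPI_SUCCESS)
        return 0;
    return atomic;
}
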
>>>>>>
>>>>>> Martin 
>>>>>>
>>>>>>
>>>>>> On Dec 3, 2013, at 2:27 PM, "Jeff Squyres (jsquyres)" <jsquyres at cisco.com>
>>>>>> wrote:
>>>>>>
>>>>>>> I'm trawling through SC-accumulated emails and saw one that prompted me to ask here (I think I asked about this before, but I don't remember): is there any interest in atomic vector reads of MPI_T variables?
>>>>>>>
>>>>>>> I ask because, especially for pvars, if you loop over reading a bunch of them, the individual reads are not atomic with respect to each other, and you might not get consistent values.  But if you can issue a single MPI_T read for N values all at once, you have a much better chance of getting an atomic/consistent set of values.
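
To make the difference concrete, here is a minimal sketch in C: the first function is the status quo, a loop over the existing MPI_T_pvar_read call, where each read is an independent snapshot; the second is a hypothetical vectored call (MPI_T_pvar_read_multi is not part of the MPI standard; the name and signature are illustrative only) that an implementation could serve from a single consistent snapshot. The sketch assumes each pvar handle was allocated with count 1 and an unsigned-long-long datatype.

#include <mpi.h>

/* Status quo: one MPI_T_pvar_read per variable. Each call is an
 * independent snapshot, so the n values need not be mutually consistent.
 * Assumes every handle was allocated with count == 1 and an
 * MPI_UNSIGNED_LONG_LONG datatype. */
static void read_pvars_one_by_one(MPI_T_pvar_session session,
                                  MPI_T_pvar_handle handles[], int n,
                                  unsigned long long values[])
{
    for (int i = 0; i < n; i++)
        MPI_T_pvar_read(session, handles[i], &values[i]);
}

/* Hypothetical vectored read (NOT in the MPI standard). The idea is that
 * the implementation takes whatever internal lock or snapshot it needs
 * once, so the n returned values form a consistent set. */
int MPI_T_pvar_read_multi(MPI_T_pvar_session session, int n,
                          const MPI_T_pvar_handle handles[], void *bufs[]);
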
>>>>>>>
>>>>>>> -- 
>>>>>>> Jeff Squyres
>>>>>>> jsquyres at cisco.com
>>>>>>> For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/
>>>>>>>
>>>>>>
>>>>>> ________________________________________________________________________
>>>>>> Martin Schulz, schulzm at llnl.gov, http://people.llnl.gov/schulzm
>>>>>> CASC @ Lawrence Livermore National Laboratory, Livermore, USA
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> -- 
>>>>> Jeff Squyres
>>>>> jsquyres at cisco.com
>>>>> For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/
>>>>>
>>>>
>>>> ______________________________________________________________
>>>> Kathryn Mohror, kathryn at llnl.gov, http://people.llnl.gov/mohror1
>>>> CASC @ Lawrence Livermore National Laboratory, Livermore, CA, USA
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>> -- 
>>> Jeff Squyres
>>> jsquyres at cisco.com
>>> For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/
>>>
>>
>> ______________________________________________________________
>> Kathryn Mohror, kathryn at llnl.gov, http://people.llnl.gov/mohror1
>> CASC @ Lawrence Livermore National Laboratory, Livermore, CA, USA
>>
>>
>>
>>
> 
> ________________________________________________________________________
> Martin Schulz, schulzm at llnl.gov, http://people.llnl.gov/schulzm
> CASC @ Lawrence Livermore National Laboratory, Livermore, USA
> 
> 
> 
> 

-- 
Marc-Andre Hermanns
German Research School for
Simulation Sciences GmbH
c/o Laboratory for Parallel Programming
52062 Aachen | Germany

Tel +49 241 80 99753
Fax +49 241 80 6 99753
Web www.grs-sim.de

Members: Forschungszentrum Jülich GmbH | RWTH Aachen University
Registered in the commercial register of the local court of
Düren (Amtsgericht Düren) under registration number HRB 5268
Registered office: Jülich
Executive board: Prof. Marek Behr, Ph.D | Prof. Dr. Sebastian M. Schmidt


