[mpiwg-tools] Vector MPI_T reads
Kathryn Mohror
kathryn at llnl.gov
Mon Dec 9 08:55:57 CST 2013
Hi all,
Sorry for the delay on the agenda. Hopefully, this will reach you in enough time for planning on joining remotely.
We have a lot of items to discuss and I have a feeling we will spend more time on them than we expect and won't get through them all today. We can continue tomorrow if there is a room available. If anyone wants to join remotely tomorrow, let us know and we'll try to set it up.
- ticket #384
- ticket #383
- Fortran wrapper generator
- PMPI2
- error codes for MPI_T_ENUM functions
- atomic vector reads of pvars
- can strings in an enum change over time?
- multiple device support -- convention for associating pvars with descriptive enum
If remote attendees are interested in discussing particular items and have time constraints, let me know and we can try to discuss those when you are on the webex.
Also, if anyone notes an item I missed, please let me know so we can add it to the agenda.
Kathryn
On Dec 5, 2013, at 10:26 AM, Kathryn Mohror <kathryn at llnl.gov> wrote:
> And I'll try to send out a rough agenda by Sunday so people can plan on when they want to be on the webex.
>
> Kathryn
>
> On Dec 5, 2013, at 8:58 AM, Jeff Squyres (jsquyres) <jsquyres at cisco.com> wrote:
>
>> Sure, we can do webexes. I'll put them on the web site.
>>
>>
>> On Dec 5, 2013, at 10:07 AM, Marc-Andre Hermanns <m.a.hermanns at grs-sim.de> wrote:
>>
>>> Martin,
>>>
>>> will there be the opportunity to join via WebEx or Skype? I cannot
>>> promise to join for the full 4 hours, but for some topics it would be
>>> great to join as I did in one of the last meetings.
>>>
>>> Cheers,
>>> Marc-Andre
>>>
>>> On 04.12.13 16:53, Schulz, Martin wrote:
>>>> I agree - interesting, looking forward to hearing more. I agree also with Kathryn - some of those don't sound like MPI_T, but this could also speak for an integration through PAPI so that tools can grab all data from one source.
>>>>
>>>> We are getting quite a list to talk about (I'll also have some items for PMPI2 ready, and I think we should take a look at Craig's wrapper generator) - I know Jeff is also busy in the hybrid WG, but we could have some of the discussions on Tuesday in parallel to RMA if we need/want the time. Fab has two rooms for us all day on Tuesday as well.
>>>>
>>>> Martin
>>>>
>>>>
>>>> On Dec 4, 2013, at 7:36 AM, Kathryn Mohror <kathryn at llnl.gov>
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>>> FWIW, we're getting some interesting asks from users for MPI_T pvars:
>>>>>>
>>>>>> - underlying network data
>>>>>> - But then how do you correlate it to MPI activity? Not as easy as you would think.
>>>>>> - other hardware / environment data
>>>>>> - CPU temperature, fan speed, ...etc. (think: IPMI-like data)
>>>>>> - Should these be reported once per server, or in every MPI process?
>>>>>> - stats that have traditionally been exposed via PMPI
>>>>>> - E.g., number of times MPI_Send was invoked
>>>>>> - other MPI layer data
>>>>>> - E.g., average depth of unexpected queue
>>>>>>
>>>>>> Of this list, I think "other MPI layer data" is the only type of data I expected to expose via MPI_T (i.e., it's kinda what we designed MPI_T for). The others are all use cases that I was not previously expecting. Interesting.
>>>>>
>>>>> Some of these are outside the scope of what I would have expected MPI_T to provide and could be provided by other libraries. I agree that I imagined it would be information specific to the MPI library, or whatever it was already collecting. But, you are right, interesting!
>>>>>
>>>>> Kathryn
>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Dec 4, 2013, at 10:20 AM, Kathryn Mohror <kathryn at llnl.gov> wrote:
>>>>>>
>>>>>>> Yes. I'll put it on the agenda.
>>>>>>>
>>>>>>> Kathryn
>>>>>>>
>>>>>>> On Dec 4, 2013, at 3:27 AM, Jeff Squyres (jsquyres) <jsquyres at cisco.com> wrote:
>>>>>>>
>>>>>>>> Can we add this to the agenda to discuss next week?
>>>>>>>>
>>>>>>>>
>>>>>>>> On Dec 4, 2013, at 1:05 AM, "Schulz, Martin" <schulzm at llnl.gov> wrote:
>>>>>>>>
>>>>>>>>> I think this could be interesting and helpful, but wouldn't this be expensive in the implementation, at least for some variables? Would we need some way to say which variables can be read atomically?
>>>>>>>>>
>>>>>>>>> Martin
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Dec 3, 2013, at 2:27 PM, "Jeff Squyres (jsquyres)" <jsquyres at cisco.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> I'm trolling through SC-accumulated emails and saw one that prompted me to ask here (I think I asked about this before, but don't remember): is there any interest in atomic vector reads of MPI_T variables?
>>>>>>>>>>
>>>>>>>>>> I ask because, especially for pvars, if you loop over reading a bunch of them, they're not atomic, and you might not get consistent values. But if you can issue a single MPI_T read for N values all at once, you have a much better chance of getting an atomic/consistent set of values.
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Jeff Squyres
>>>>>>>>>> jsquyres at cisco.com
>>>>>>>>>> For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/
>>>>>>>>>>
>>>>>>>>>> _______________________________________________
>>>>>>>>>> mpiwg-tools mailing list
>>>>>>>>>> mpiwg-tools at lists.mpi-forum.org
>>>>>>>>>> http://lists.mpi-forum.org/mailman/listinfo.cgi/mpiwg-tools
>>>>>>>>>
>>>>>>>>> ________________________________________________________________________
>>>>>>>>> Martin Schulz, schulzm at llnl.gov, http://people.llnl.gov/schulzm
>>>>>>>>> CASC @ Lawrence Livermore National Laboratory, Livermore, USA
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>> ______________________________________________________________
>>>>>>> Kathryn Mohror, kathryn at llnl.gov, http://people.llnl.gov/mohror1
>>>>>>> CASC @ Lawrence Livermore National Laboratory, Livermore, CA, USA
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> _______________________________________________
>>>>>>> mpiwg-tools mailing list
>>>>>>> mpiwg-tools at lists.mpi-forum.org
>>>>>>> http://lists.mpi-forum.org/mailman/listinfo.cgi/mpiwg-tools
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>> --
>>> Marc-Andre Hermanns
>>> German Research School for
>>> Simulation Sciences GmbH
>>> c/o Laboratory for Parallel Programming
>>> 52062 Aachen | Germany
>>>
>>> Tel +49 241 80 99753
>>> Fax +49 241 80 6 99753
>>> Web www.grs-sim.de
>>>
>>> Members: Forschungszentrum Jülich GmbH | RWTH Aachen University
>>> Registered in the commercial register of the local court of
>>> Düren (Amtsgericht Düren) under registration number HRB 5268
>>> Registered office: Jülich
>>> Executive board: Prof. Marek Behr, Ph.D | Prof. Dr. Sebastian M. Schmidt
>>>
>>
>>
>
______________________________________________________________
Kathryn Mohror, kathryn at llnl.gov, http://people.llnl.gov/mohror1
CASC @ Lawrence Livermore National Laboratory, Livermore, CA, USA