[mpiwg-tools] MPI_T names of variables

Marc-Andre Hermanns m.a.hermanns at grs-sim.de
Fri Jan 10 14:53:49 CST 2014


Hi Martin,

> I always expected that if a variable named "foo" exists in two
> processes, it collects the same metric with the same semantics.

Yes, that is what I would hope for, too. But as the text is currently
written, a fully standard-compliant implementation does not have to
work this way, and there seems to be no viable way for a tool to find
out which situation it is in, right?
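
To make the tool's dilemma concrete, here is a minimal sketch (error
checking omitted, buffer sizes arbitrary) of how a tool enumerates
performance variables via MPI_T_pvar_get_num and MPI_T_pvar_get_info.
The local index is meaningless across processes, so the name string is
the only thing a tool can correlate on:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, num_pvar;

    MPI_Init(&argc, &argv);
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

    MPI_T_pvar_get_num(&num_pvar);

    for (int i = 0; i < num_pvar; i++) {
        char name[256], desc[1024];
        int name_len = sizeof(name), desc_len = sizeof(desc);
        int verbosity, var_class, bind, readonly, continuous, atomic;
        MPI_Datatype datatype;
        MPI_T_enum enumtype;

        MPI_T_pvar_get_info(i, name, &name_len, &verbosity, &var_class,
                            &datatype, &enumtype, desc, &desc_len,
                            &bind, &readonly, &continuous, &atomic);

        /* The index i is only meaningful locally; to correlate this
         * variable with one on another process, a tool has nothing to
         * go by except the string in 'name'. */
        printf("pvar %d: %s\n", i, name);
    }

    MPI_T_finalize();
    MPI_Finalize();
    return 0;
}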

> When we wrote the paragraph we (or at least I) had in mind that the
> set of variables can be different (i.e., if "foo" exists on A, it
> does not have to exist on B) and the order can be different.

That is what I took from the first sentence of the advice to users,
which explicitly states that the number and type of variables can
change even between runs. That is fine, I guess, apart from the other
issues that Michael raised about building a PAPI component.

> Not sure if we can formalize this or if people would go along with
> it, but it would certainly be advantageous for the tools.

That would be a question for the MPI vendors: would it impose a
serious restriction on their implementations to require that a
variable name must not change its semantics within a given version of
the implementation?

But then again, what does "version" mean here, and how would we define
it ... :-/
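
For what it is worth, MPI-3 at least gives us MPI_Get_library_version.
A tool could record that string next to its measurements and only
compare "foo" across runs when both runs used the same library build.
Whether that string carries any semantic guarantee is, of course,
exactly the open question. A minimal sketch:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char version[MPI_MAX_LIBRARY_VERSION_STRING];
    int len;

    MPI_Init(&argc, &argv);

    /* Implementation-defined, human-readable version string (MPI-3). */
    MPI_Get_library_version(version, &len);
    printf("MPI library: %s\n", version);

    MPI_Finalize();
    return 0;
}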

> However, having said that, I can actually envision scenarios where
> this would be hard to achieve. If you have a heterogeneous system
> (two architectures/networks or MPI bridging GPU and CPU) it may be a
> combination of two separate MPIs (with separate name spaces) or the
> same metric may have different meanings (similar to Flop counts
> having separate meanings on different architectures). I'd be curious
> to hear what our implementors think about that. In the homogeneous
> case, though, I would fully expect some consistency between
> processes.

As Michael pointed out in his reply, the "realm" idea could provide
the means to define namespaces in a way that tools could query.
However, I am still unsure whether something like this is overkill, or
whether it would solve a problem that does not actually exist in
practice.
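
Just to illustrate what I mean (the prefix convention below is purely
hypothetical; nothing in the standard or in the ticket defines it): if
variable names carried a realm prefix such as "implA.foo", a tool
could separate overlapping namespaces with plain string handling:

#include <stdio.h>
#include <string.h>

/* Hypothetical: split "realm.variable" at the first dot; returns 0 if
 * no realm prefix is present. 'realm' must hold at least 'len' bytes. */
static int split_realm(const char *name, char *realm, size_t len)
{
    const char *dot = strchr(name, '.');
    if (dot == NULL || (size_t)(dot - name) + 1 > len)
        return 0;
    memcpy(realm, name, dot - name);
    realm[dot - name] = '\0';
    return 1;
}

int main(void)
{
    const char *examples[] = { "implA.foo", "implB.foo", "bar" };
    char realm[64];

    for (int i = 0; i < 3; i++) {
        if (split_realm(examples[i], realm, sizeof(realm)))
            printf("%-10s -> realm \"%s\"\n", examples[i], realm);
        else
            printf("%-10s -> no realm prefix\n", examples[i]);
    }
    return 0;
}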

Cheers,
Marc-Andre

> On Jan 10, 2014, at 10:52 AM, Marc-Andre Hermanns <m.a.hermanns at grs-sim.de>
>  wrote:
> 
>> Dear all,
>>
>> I hope all of you had a good start to the new year.
>>
>> Let me apologize in advance for the lengthy text following.
>>
>> I am currently in discussion with the developers of the metric
>> modules for Scalasca and Score-P, and I think we need some
>> clarification. Please correct my understanding if anything in the
>> following is wrong.
>>
>> Here is the current quote of the advice to users in the ticket PDF:
>>
>> "[...]Further, there is no guarantee that number of variables, variable
>> indices, and variable names are the same across processes."
>>
>> My reading of this is the following:
>> a variable reporting some metric X may be called "foo" on one process
>> and "bar" on another.
>>
>> Is that correct?
>>
>> a) How should a tool aggregate and correlate metrics if there is no
>> way of knowing which ones belong together (after all, both indices and
>> names may differ)?
>>
>> b) Does this allow the namespaces of metrics to overlap? Suppose the
>> variable for metric X is called "foo" on one process and "bar" on
>> another. Is the second process allowed to have a variable "foo" that
>> actually reports a different metric? How would a tool detect the
>> difference?
>>
>> I know that the new "Advice to implementors" is supposed to clarify
>> this, but I am unsure whether making this purely a "quality of
>> implementation" issue will work for providers of portable tools.
>>
>> Also, with differences among runs, tools like the Cube Algebra (which
>> can compute averages over multiple measurements) no longer work, as
>> Cube cannot know whether "foo" from run one corresponds to "foo" from
>> run two, rendering the whole approach moot.
>>
>> Cheers,
>> Marc-Andre

-- 
Marc-Andre Hermanns
German Research School for
Simulation Sciences GmbH
c/o Laboratory for Parallel Programming
52062 Aachen | Germany

Tel +49 241 80 99753
Fax +49 241 80 6 99753
Web www.grs-sim.de

Members: Forschungszentrum Jülich GmbH | RWTH Aachen University
Registered in the commercial register of the local court of
Düren (Amtsgericht Düren) under registration number HRB 5268
Registered office: Jülich
Executive board: Prof. Marek Behr, Ph.D | Prof. Dr. Sebastian M. Schmidt
