[MPIWG Fortran] ticket 351

Jeff Hammond jeff.science at gmail.com
Mon Nov 9 11:43:55 CST 2015


MPI implementations today are remarkably standardized w.r.t. process
launching, such that command-line arguments are handled prior to MPI_Init.
However, there have been times when that was not the case.

Can you check what the Cray T3D does w.r.t. arguments passed to the host
node appearing on the compute nodes? :-)

More seriously, does Fortran 2008 require that arguments passed to the
invocation of mpirun (or equivalent) appear at every process in a
multi-process MPI program?
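
(For concreteness, a minimal sketch of the experiment I have in mind, assuming
only the mpi module and the standard GET_COMMAND_ARGUMENT intrinsic: every rank
prints its view of the command line, so one can see whether the arguments given
to mpirun/aprun reach all processes.)

program argcheck
  use mpi
  implicit none
  integer :: ierr, rank, i, length
  character(256) :: arg
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  ! Argument 0 is the command name; 1..count are the actual arguments.
  do i = 0, command_argument_count()
     call get_command_argument(i, arg, length)
     print '(a,i0,a,i0,a,a)', 'rank ', rank, ' arg ', i, ': ', arg(1:length)
  end do
  call MPI_Finalize(ierr)
end program argcheck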

Jeff

On Mon, Nov 9, 2015 at 8:56 AM, Bill Long <longb at cray.com> wrote:

>
> On Nov 9, 2015, at 8:56 AM, Jeff Hammond <jeff.science at gmail.com> wrote:
>
> >
> >
> > On Mon, Nov 9, 2015 at 6:37 AM, Bill Long <longb at cray.com> wrote:
> >
> > On Nov 9, 2015, at 8:24 AM, Jeff Hammond <jeff.science at gmail.com> wrote:
> >
> > > Did you read the ticket?  We need this for the same reason MPI_Init()
> takes argc/argv even though C main() already has them.
> >
> > But the C standard does not implicitly assume every program is parallel.
> >
> >
> > Correct.  Nor can MPI assume that argc/argv will magically appear on the
> compute nodes when the job is launched from somewhere else.
> >
>
> Agreed. My point is that Fortran is not the same as C.
>
> > >
> > > While it may be true that process managers are magically fixing
> argc/argv, this is not guaranteed by any standard.
> >
> > The Fortran standard is intentionally vague to allow for environments
> where there is no command line, but rather (for example) the code starts
> executing when a graphical button on the screen is clicked.  The standard is
> unlikely to include specific words about program launchers, which might
> turn into bit rot in the future.  If your favorite compiler returns the
> launcher text as part of the command line, complain.
> >
> >
> > Complaining is fun, but has no teeth without a standard.  Many vendors
> are quite strict about standards and reject feature requests that improve
> user experience if they lack justification in a standard.  I can cite
> numerous vendors here…
>
> The overriding intent of a language standard is portability.  You would
> have poor portability of a code if moving it from one machine to another
> resulted in different command argument information from get_command or
> get_command_argument.  The intention is that these routines return the command
> line arguments that the program knows about, not some cryptic stuff from
> aprun, srun, or mpirun.  I did try all the compilers I have access to on a
> trivial code:
>
> > cat test.f90
> program test
>   character(1000) command
>   integer         length
>   call get_command (command, length)
>   print *, command(1:length)
> end program test
>
> > ftn test.f90
> > aprun -n1 ./a.out -x
>  ./a.out -x
> Application 15961022 resources: utime ~0s, stime ~0s, Rss ~4176, inblocks
> ~4479, outblocks ~11453
> > module swap PrgEnv-cray PrgEnv-intel
> > ftn test.f90
> > aprun -n1 ./a.out -x
>  ./a.out -x
> Application 15961023 resources: utime ~0s, stime ~0s, Rss ~4176, inblocks
> ~2967, outblocks ~7756
> > module swap PrgEnv-intel PrgEnv-pgi
> > ftn test.f90
> > aprun -n1 ./a.out -x
>  ./a.out -x
> Application 15961025 resources: utime ~0s, stime ~0s, Rss ~4172, inblocks
> ~3038, outblocks ~8308
> > module swap PrgEnv-pgi PrgEnv-gnu
> > ftn test.f90
> > aprun -n1 ./a.out -x
>  ./a.out -x
> Application 15961026 resources: utime ~0s, stime ~0s, Rss ~4172, inblocks
> ~3031, outblocks ~8027
>
> All of them produced the expected output (excluding the “aprun -n1”
> launcher text).
>
> Perhaps we could add a Note in the Fortran standard explaining this, but
> it looks like vendors have already figured it out.
>
> Cheers,
> Bill
>
>
>
> >
> > While the new proposed routines are trivial to implement (assuming they
> have the same arguments as the corresponding Fortran ones), they add to
> one of MPI's most serious flaws: way too many routines already.
> >
> >
> > As you say, they are trivial to implement.  The size of the standard is
> an irrelevant argument.  Users don't read what they don't need to read, and
> users shouldn't be reading the spec most of the time anyway.
> >
> > Jeff
> >
> > Cheers,
> > Bill
> >
> >
> > >
> > > Jeff
> > >
> > > On Mon, Nov 9, 2015 at 6:21 AM, Bill Long <longb at cray.com> wrote:
> > > Hi Jeff,
> > >
> > > I don’t see the value of this.  The Fortran intrinsics GET_COMMAND and
> friends should already do what you want.  Fortran programs are now presumed
> parallel. The intrinsics should know how to strip off the “launcher” part
> of the command line.  At least that is how the Cray versions have worked
> from the beginning.   Experiments are  in order for other vendors, as a
> check.
> > >
> > > Cheers,
> > > Bill
> > >
> > >
> > >
> > > On Nov 9, 2015, at 8:13 AM, Jeff Hammond <jeff.science at gmail.com>
> wrote:
> > >
> > > > we did a lot of work on
> https://svn.mpi-forum.org/trac/mpi-forum-web/ticket/351.  do we intend to
> move this forward for MPI 4+?  if yes, we need to migrate the trac ticket
> to a github issue.
> > > >
> > > > jeff
> > > >
> > > > --
> > > > Jeff Hammond
> > > > jeff.science at gmail.com
> > > > http://jeffhammond.github.io/
> > >
> > >
> > >
> > >
> > >
> > >
> > > --
> > > Jeff Hammond
> > > jeff.science at gmail.com
> > > http://jeffhammond.github.io/
> >
> >
> >
> >
> >
> >
> > --
> > Jeff Hammond
> > jeff.science at gmail.com
> > http://jeffhammond.github.io/
>
> Bill Long, longb at cray.com
> Fortran Technical Support & Bioinformatics Software Development
> voice: 651-605-9024    fax: 651-605-9142
> Cray Inc./ Cray Plaza, Suite 210/ 380 Jackson St./ St. Paul, MN 55101
>
>
>



-- 
Jeff Hammond
jeff.science at gmail.com
http://jeffhammond.github.io/