[MPIWG Fortran] MPI_STATUS_SIZE: Why only in Fortran?

William Gropp wgropp at illinois.edu
Tue Sep 8 08:43:29 CDT 2020


I honestly can't remember why we restricted MPI_STATUS_SIZE to Fortran - my best guess is the one Jeff mentions: that we were preserving the possibility that the C and Fortran statuses could be different sizes, and the possible confusion with sizeof(MPI_Status) is real.  I'd go with a distinct name that includes FORTRAN as an alternative.
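
For concreteness, here is a minimal sketch of the round-trip test Jeff describes below, written as if such a C-side constant existed. The name MPI_FORTRAN_STATUS_SIZE and the fallback value of 6 are assumptions for illustration only; MPI-3.1 defines no such constant in C, and the real size is implementation-dependent.

  /* Sketch only: MPI_FORTRAN_STATUS_SIZE is hypothetical (a name proposed in
     this thread); the fallback value below is a guess and would have to match
     the implementation's Fortran MPI_STATUS_SIZE. */
  #include <mpi.h>
  #include <stdio.h>

  #ifndef MPI_FORTRAN_STATUS_SIZE
  #define MPI_FORTRAN_STATUS_SIZE 6
  #endif

  int main(int argc, char **argv)
  {
      MPI_Init(&argc, &argv);

      int rank, sendbuf = 42, recvbuf = 0;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);

      /* Produce a C status by receiving a message from ourselves. */
      MPI_Status c_status;
      MPI_Sendrecv(&sendbuf, 1, MPI_INT, rank, 0,
                   &recvbuf, 1, MPI_INT, rank, 0,
                   MPI_COMM_WORLD, &c_status);

      /* The stack-allocated Fortran-style status array Jeff wants to declare. */
      MPI_Fint f_status[MPI_FORTRAN_STATUS_SIZE];

      /* Round-trip: C status -> INTEGER-array status -> C status. */
      MPI_Status_c2f(&c_status, f_status);
      MPI_Status back;
      MPI_Status_f2c(f_status, &back);

      if (back.MPI_SOURCE == c_status.MPI_SOURCE &&
          back.MPI_TAG == c_status.MPI_TAG)
          printf("status round-trip OK\n");

      MPI_Finalize();
      return 0;
  }

Without a standard constant, the array bound above has to be hard-coded per implementation or obtained from a Fortran compilation unit, which is exactly the gap Jeff points out.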

Bill

William Gropp
Director and Chief Scientist, NCSA
Thomas M. Siebel Chair in Computer Science
University of Illinois Urbana-Champaign






> On Sep 8, 2020, at 8:32 AM, Jeff Squyres (jsquyres) via mpiwg-fortran <mpiwg-fortran at lists.mpi-forum.org> wrote:
> 
> Question: why is MPI_STATUS_SIZE only defined in Fortran?
> 
> MPI-3.1:2.5.4 p15:36 specifically states that MPI_STATUS_SIZE is only defined in Fortran.  Why isn't it also defined in C?
> 
> I ask because I was just writing a test for the MPI_Status conversion functions in Open MPI.  The test spans both Fortran and C.  But in order to test the MPI_Status_f2c / MPI_Status_c2f functions (which are only defined in C, mind you), I wanted to make a stack-allocated array of MPI_Fint's for the INTEGER-array-style Fortran status.  ...but I couldn't, because MPI_STATUS_SIZE isn't defined in C.  Indeed, there's no mechanism in C for knowing how long an INTEGER-style Fortran status array needs to be -- even though we have functions in C for dealing with INTEGER-style Fortran statuses.
> 
> To be totally clear, you can't do this in C:
> 
>  MPI_Fint f_status[MPI_STATUS_SIZE];
> 
> Is this an oversight?  Shouldn't we define MPI_STATUS_SIZE in all languages?
> 
> Or, perhaps if there's possible confusion between MPI_STATUS_SIZE in C and sizeof(MPI_Status) (since they could actually be different values), perhaps we could call it MPI_FORTRAN_STATUS_SIZE in C?  (Or even MPI_FORTRAN_INTEGER_STATUS_SIZE to differentiate it from sizeof(TYPE(MPI_Status))?)
> 
> Thoughts?
> 
> If we view this as an errata, I think there's still time to get an update in for MPI-4.
> 
> -- 
> Jeff Squyres
> jsquyres at cisco.com
> 
> _______________________________________________
> mpiwg-fortran mailing list
> mpiwg-fortran at lists.mpi-forum.org
> https://lists.mpi-forum.org/mailman/listinfo/mpiwg-fortran
