[mpiwg-coll] Error/gap in MPI_NEIGHBOR_ALLTOALL/ALLGATHER

Rolf Rabenseifner rabenseifner at hlrs.de
Fri Sep 27 17:09:24 CDT 2019


Dear MPI collective WG,

   you may want to look into resolving the problem of a possibly 
   incorrect MPI specification for MPI_NEIGHBOR_ALLTOALL/ALLGATHER.

Dear MPI Forum member,

   you may own or use an MPI implementation whose
   MPI_NEIGHBOR_ALLTOALL/ALLGATHER routines exhibit race conditions
   if the number of processes in one dimension is only 1 or 2
   and that dimension is periodic (periodic==true).
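
To illustrate, below is a minimal sketch of the single-process case
in C (the 1-D grid setup and the buffer values are only illustrative
and are not taken from the attached slides):

   #include <stdio.h>
   #include <mpi.h>

   int main(int argc, char **argv)
   {
       MPI_Init(&argc, &argv);

       /* Build a 1-D periodic Cartesian grid; with -np 1 (or -np 2)
        * the lower and the upper neighbor are the same process. */
       int size, rank;
       MPI_Comm_size(MPI_COMM_WORLD, &size);
       int dims[1] = {0}, periods[1] = {1};
       MPI_Dims_create(size, 1, dims);
       MPI_Comm cart;
       MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &cart);
       MPI_Comm_rank(cart, &rank);

       /* sendbuf[0] goes to the lower neighbor, sendbuf[1] to the
        * upper one; recvbuf[0] must arrive from the lower neighbor,
        * recvbuf[1] from the upper one. */
       int sendbuf[2] = {100 + rank, 200 + rank};
       int recvbuf[2] = {-1, -1};
       MPI_Neighbor_alltoall(sendbuf, 1, MPI_INT,
                             recvbuf, 1, MPI_INT, cart);

       printf("rank %d: recvbuf = {%d, %d}\n",
              rank, recvbuf[0], recvbuf[1]);

       MPI_Finalize();
       return 0;
   }

With mpirun -np 1 the sequential expectation is recvbuf = {200, 100},
i.e., each receive slot is filled from the opposite send slot of the
same process.  An implementation that maps the operation onto two
point-to-point messages with identical (source, tag, communicator)
envelopes may match them in either order and deliver the swapped
result {100, 200}; that is the race condition mentioned above.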

The problem was reported as a bug in the Open MPI library 
by Simone Chiochetti from DICAM at the University of Trento, 
but it seems to be a bug in the MPI specification itself,
or at least a missing advice to implementors.

I produced a set of animated slides.
Please look at them in presentation mode with animation.

Have fun with a problem that clearly prevents the use
of the MPI_NEIGHBOR_... routines with cyclic boundary conditions
if one wants to verify that mpirun -np 1 does
the same as the sequential code.

Best regards
Rolf

-- 
Dr. Rolf Rabenseifner . . . . . . . . . .. email rabenseifner at hlrs.de .
High Performance Computing Center (HLRS) . phone ++49(0)711/685-65530 .
University of Stuttgart . . . . . . . . .. fax ++49(0)711 / 685-65832 .
Head of Dpmt Parallel Computing . . . www.hlrs.de/people/rabenseifner .
Nobelstr. 19, D-70550 Stuttgart, Germany . . . . (Office: Room 1.307) .
[Attachment: neighbor_mpi-3_bug.pptx (345804 bytes)
 <http://lists.mpi-forum.org/pipermail/mpiwg-coll/attachments/20190928/5a20d7e5/attachment-0001.pptx>]

