[Mpi-forum] Error/gap in MPI_NEIGHBOR_ALLTOALL/ALLGATHER
Anthony Skjellum
skjellum at auburn.edu
Fri Sep 27 17:37:25 CDT 2019
Rolf, let's open a ticket.
Anthony Skjellum, PhD
205-807-4968
> On Sep 27, 2019, at 6:09 PM, Rolf Rabenseifner via mpi-forum <mpi-forum at lists.mpi-forum.org> wrote:
>
> Dear MPI collective WG,
>
> you may want to resolve this problem, which may stem from
> an error in the MPI specification of MPI_NEIGHBOR_ALLTOALL/ALLGATHER.
>
> Dear MPI Forum member,
>
> you may own or use an MPI implementation whose
> MPI_NEIGHBOR_ALLTOALL/ALLGATHER has race conditions
> when the number of processes in a dimension is
> only 1 or 2 and periodic==true.
>
> The problem was reported as a bug in the Open MPI library
> by Simone Chiochetti from DICAM at the University of Trento,
> but it appears to be a bug in the MPI specification itself,
> or at least a missing advice to implementors.
>
> I produced a set of animated slides.
> Please look at them in presentation mode with animation.
>
> Have fun with a problem that clearly prevents the use
> of the MPI_NEIGHBOR_... routines with cyclic boundary conditions
> if one wants to verify that mpirun -np 1 produces
> the same results as the sequential code.
>
> Best regards
> Rolf
>
> --
> Dr. Rolf Rabenseifner . . . . . . . . . .. email rabenseifner at hlrs.de .
> High Performance Computing Center (HLRS) . phone ++49(0)711/685-65530 .
> University of Stuttgart . . . . . . . . .. fax ++49(0)711 / 685-65832 .
> Head of Dpmt Parallel Computing . . . www.hlrs.de/people/rabenseifner .
> Nobelstr. 19, D-70550 Stuttgart, Germany . . . . (Office: Room 1.307) .
> <neighbor_mpi-3_bug.pptx>
> _______________________________________________
> mpi-forum mailing list
> mpi-forum at lists.mpi-forum.org
> https://lists.mpi-forum.org/mailman/listinfo/mpi-forum
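The degenerate-neighbor situation described above can be sketched without running MPI at all. The following Python snippet (the helper `cart_shift_periodic` is hypothetical, written here to mirror what MPI_Cart_shift computes for a periodic dimension) shows that with 1 or 2 processes in a periodic dimension, the left and right neighbors coincide, so the two receives posted by MPI_NEIGHBOR_ALLTOALL come from the same source rank and an implementation matching messages by (source, tag) alone can deliver them in either order:

```python
# Hypothetical sketch of MPI_Cart_shift for one periodic dimension:
# returns the (source, dest) ranks for a shift by `disp`.
def cart_shift_periodic(rank, nprocs, disp=1):
    src = (rank - disp) % nprocs   # neighbor we receive from
    dst = (rank + disp) % nprocs   # neighbor we send to
    return src, dst

# nprocs == 1: both neighbors are the process itself.
print(cart_shift_periodic(0, 1))   # (0, 0)
# nprocs == 2: left and right neighbor are the same other process.
print(cart_shift_periodic(0, 2))   # (1, 1)
# nprocs >= 3: the two neighbors are distinct, so no ambiguity.
print(cart_shift_periodic(0, 3))   # (2, 1)
```

In the first two cases an implementation built on point-to-point calls with a single tag cannot tell which incoming message belongs in the "left" slot and which in the "right" slot of the receive buffer, which is the race condition reported in the slides.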