[Mpi-forum] MPI International Survey
William Gropp
wgropp at illinois.edu
Fri Dec 7 16:54:30 CST 2018
I too found some of these questions impossible to answer because none of the available answers was accurate enough, and I gave up rather than provide misleading results.
I do think that this specific question is almost OK, but I really think it would be better to ask about the requirements that lead people not to program *directly* in MPI. Such a question would be better at getting at the root cause of a change, including identifying a lack of understanding of the capabilities of MPI.
The example in this space that drives me as nuts as it drove Jeff is talks that say “We have to use XX to support dynamic tasking instead of using MPI,” only to discover, halfway through the talk, that XX is implemented successfully *entirely* on top of MPI. The root issue is not “I can’t do this with MPI”; it is “I want more support for my specific needs than is part of MPI itself.” Particularly because one of the original goals of MPI was to support the construction of tools, this is a *vital* distinction, making clear the need for *both* MPI and higher-level tools. It is not an either-or situation.
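To make the distinction concrete, dynamic tasking needs nothing beyond what MPI already provides. Here is a minimal manager/worker sketch of my own (the tags and the task count NTASKS are placeholders; this is not the implementation of any particular tool):

#include <mpi.h>
#include <stdio.h>

enum { TAG_REQUEST = 1, TAG_TASK = 2, TAG_STOP = 3, NTASKS = 100 };

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (rank == 0) {            /* manager: hand out tasks on demand */
        int next = 0, done = 0, dummy;
        MPI_Status st;
        while (done < size - 1) {
            MPI_Recv(&dummy, 1, MPI_INT, MPI_ANY_SOURCE, TAG_REQUEST,
                     MPI_COMM_WORLD, &st);
            if (next < NTASKS) {
                MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_TASK,
                         MPI_COMM_WORLD);
                next++;
            } else {
                MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_STOP,
                         MPI_COMM_WORLD);
                done++;
            }
        }
    } else {                    /* worker: pull tasks until told to stop */
        int task, dummy = 0;
        MPI_Status st;
        for (;;) {
            MPI_Send(&dummy, 1, MPI_INT, 0, TAG_REQUEST, MPI_COMM_WORLD);
            MPI_Recv(&task, 1, MPI_INT, 0, MPI_ANY_TAG, MPI_COMM_WORLD, &st);
            if (st.MPI_TAG == TAG_STOP) break;
            printf("rank %d executes task %d\n", rank, task);
        }
    }
    MPI_Finalize();
    return 0;
}

A real tasking layer adds smarter scheduling and data movement, but it can do so on these same MPI primitives.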
Such questionnaires are hard to write; I volunteer to provide feedback should you want to revise it.
Bill
William Gropp
Director and Chief Scientist, NCSA
Thomas M. Siebel Chair in Computer Science
University of Illinois Urbana-Champaign
> On Dec 6, 2018, at 11:07 PM, Jeff Hammond via mpi-forum <mpi-forum at lists.mpi-forum.org> wrote:
>
> Please stop perpetuating the myth that MPI and PGAS are opposed by asking questions that support this false dichotomy.
>
> Lots of folks use MPI via other APIs, whether they be PETSc or Global Arrays. Thanks to MPI-3 RMA, PGAS can be just another abstraction layer on top of MPI.
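> To make that concrete, here is a minimal sketch of a PGAS-style "global array" on MPI-3 RMA, using passive-target one-sided operations. The sizes and the ring-neighbor access pattern are illustrative, and it assumes the unified memory model you get from MPI_Win_allocate on typical cache-coherent systems:
>
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char **argv) {
>     int rank, size, target;
>     double *base, remote;
>     MPI_Win win;
>     MPI_Init(&argc, &argv);
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     MPI_Comm_size(MPI_COMM_WORLD, &size);
>     /* Each rank contributes 8 doubles; together they form the "global array". */
>     MPI_Win_allocate(8 * sizeof(double), sizeof(double), MPI_INFO_NULL,
>                      MPI_COMM_WORLD, &base, &win);
>     for (int i = 0; i < 8; i++) base[i] = rank * 100.0 + i;
>     MPI_Barrier(MPI_COMM_WORLD);   /* everyone has initialized its slice */
>     /* Passive-target epoch: one-sided access with no action by the target. */
>     MPI_Win_lock_all(0, win);
>     target = (rank + 1) % size;
>     MPI_Get(&remote, 1, MPI_DOUBLE, target, 3 /* element index */,
>             1, MPI_DOUBLE, win);
>     MPI_Win_flush(target, win);    /* the get is now complete */
>     printf("rank %d read %g from rank %d\n", rank, remote, target);
>     MPI_Win_unlock_all(win);
>     MPI_Win_free(&win);
>     MPI_Finalize();
>     return 0;
> }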
>
> Do you have any plan (to investigate) to switch from using MPI to using any other parallel language/library? *
> A PGAS language (UPC, Coarray Fortran, OpenSHMEM, XcalableMP, ...)
>
> GCC/OpenCoarrays and Intel's coarray Fortran implementation both use MPI-3 RMA as a/the communication layer. OSHMPI is very close to a 1:1 mapping between OpenSHMEM and MPI-3 RMA. UPC could be ported to MPI-3 RMA with dynamic windows if someone cared.
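> For the UPC case, dynamic windows are the relevant mechanism: memory is attached to the window at runtime and addressed by the target's absolute address, which matches dynamically allocated shared objects. A minimal sketch (names and sizes illustrative):
>
> #include <mpi.h>
> #include <stdio.h>
> #include <stdlib.h>
>
> int main(int argc, char **argv) {
>     int rank, size, val, target;
>     MPI_Win win;
>     MPI_Init(&argc, &argv);
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     MPI_Comm_size(MPI_COMM_WORLD, &size);
>     /* Dynamic window: memory can be attached/detached at any time. */
>     MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
>     int *buf = malloc(sizeof(int));
>     *buf = rank;
>     MPI_Win_attach(win, buf, sizeof(int));
>     /* Publish each rank's buffer address; with dynamic windows the
>      * target displacement is that absolute address. The Allgather also
>      * guarantees every rank has attached and initialized its buffer. */
>     MPI_Aint myaddr, *addrs = malloc(size * sizeof(MPI_Aint));
>     MPI_Get_address(buf, &myaddr);
>     MPI_Allgather(&myaddr, 1, MPI_AINT, addrs, 1, MPI_AINT, MPI_COMM_WORLD);
>     MPI_Win_lock_all(0, win);
>     target = (rank + 1) % size;
>     MPI_Get(&val, 1, MPI_INT, target, addrs[target], 1, MPI_INT, win);
>     MPI_Win_flush(target, win);
>     printf("rank %d read %d from rank %d\n", rank, val, target);
>     MPI_Win_unlock_all(win);
>     MPI_Win_detach(win, buf);
>     MPI_Win_free(&win);
>     free(buf); free(addrs);
>     MPI_Finalize();
>     return 0;
> }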
>
> Jeff
>
> On Thu, Dec 6, 2018 at 3:19 PM Atsushi HORI via mpi-forum <mpi-forum at lists.mpi-forum.org> wrote:
> Hello, MPI Forum members,
>
> George Bosilca (UT/ICL), Emmanuel Jeannot (Inria), and I are conducting an international MPI survey to reveal the differences among countries and/or regions around the world. I had a chance to talk about this project at the MPI Forum meeting in San Jose, and many attendees agreed to help. The attached PDF file is the one presented at that meeting.
>
> I would like you to go to the Google Form (a Google account is required)
>
> https://docs.google.com/forms/d/e/1FAIpQLSfWLWsuhr4opqvn1FL5c8p7Ysz-oclZEMlBAzEGnBZkaaIiKQ/viewform
>
> to fill in your answers. At the end of the survey, I added some questions only for MPI Forum members so that you can leave comments on the survey itself.
>
> It is designed to be short and easy; I am quite sure you can answer all the questions in five minutes. I hope some MPI Forum attendees can answer the survey at the airport while waiting for their flights.
>
> The deadline for MPI Forum members is next <<<< Tuesday, Dec 11 >>>>.
>
> The production survey will, we hope, be conducted in January 2019. The first draft report will be available to all of you in April 2019.
>
> -----
> Atsushi HORI
> ahori at riken.jp
> https://www.sys.r-ccs.riken.jp
>
>
>
> --
> Jeff Hammond
> jeff.science at gmail.com
> http://jeffhammond.github.io/