[petsc-dev] https://www.dursi.ca/post/hpc-is-dying-and-mpi-is-killing-it.html

Matthew Knepley knepley at gmail.com
Mon Mar 18 11:23:19 CDT 2019


On Mon, Mar 18, 2019 at 12:15 PM Zhang, Junchao via petsc-dev
<petsc-dev at mcs.anl.gov> wrote:

> Let's see what the author thinks about PETSc. The author likes Chapel -- a
> PGAS language. In https://www.dursi.ca/post/julia-vs-chapel.html he stated
> his concerns about Chapel:
>      "the beginnings of a Chapel-native set of solvers from Scalapack or
> PETSc (both of which are notoriously hard to get started with, and in
> PETSc’s case, even install)"
>
> His slides say more:
> "
> PETSc is a widely used library for large sparse iterative solves.
>         Excellent and comprehensive library of solvers
>         It is the basis of a significant number of home-made
> simulation codes
>         It is notoriously hard to start getting running with; nontrivial
> even for experts to install.
>

This is a typical parochial take from someone with very limited experience.
People who routinely install a lot of libraries say that the PETSc install
is very smooth. People like this, who deal with nothing but their own F90
code, are scared of it. I would point out that if you want nothing else,
pip install petsc works fine. I can't believe we have spent this much time
on an idiot.
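
If anyone wants a quick sanity check that an install (from source or from
pip) actually works, a minimal smoke test might look like the sketch below;
the file name and build line are illustrative and assume PETSC_DIR and
PETSC_ARCH point at your install.

/* hello.c -- minimal PETSc smoke test (illustrative).
 * Build, e.g.:
 *   mpicc hello.c -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include \
 *         -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -o hello
 */
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = PetscPrintf(PETSC_COMM_WORLD, "PETSc is installed and running\n");
  CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}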

I agree with your point below, which is made in an expanded fashion by Bill
in his acceptance speech for the Ken Kennedy Award.

   Matt


>   Significant fraction of PETSc functionality is tied up in large CSR
> matrices of reasonable structure partitioned by row, vectors, and
> solvers built on top.
>   What would a Chapel API to PETSc look like?
>   What would a Chapel implementation of some core PETSc solvers look like?
> "
> In my view, the good and the evil of MPI grow from one root: MPI has a
> local name space and does not try to define global data structures. The
> evil is that users have to do their own global naming, which can be very
> easy (e.g., for a stencil) or very hard (e.g., when refining an
> unstructured mesh). The good is that users have the freedom to design
> their own data structures (array, CSR, tree, hash table, mesh, ...).
> PGAS languages tried to provide a set of global data structures, but that
> set is very limited and does not meet the requirements of many HPC codes.
> MPI challengers should start with AMR, not VecAXPY.
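
The VecAXPY half of that point is easy to see in code: under MPI's local
name space each rank just updates the slice of the vectors it owns, with no
communication and no global indices at all. A minimal sketch (illustrative
names, raw arrays rather than PETSc Vecs):

/* y = y + a*x on one rank's locally owned slice; nothing global needed */
void local_axpy(double a, const double *x, double *y, int nlocal)
{
  for (int i = 0; i < nlocal; i++) y[i] += a * x[i];
}

Refining an unstructured mesh, by contrast, forces the user to rebuild the
global numbering and the communication pattern -- exactly the part MPI
leaves to the user -- which is why it is the harder test for a challenger.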
>
> --Junchao Zhang
>
>
> On Sun, Mar 17, 2019 at 3:12 PM Smith, Barry F. via petsc-dev
> <petsc-dev at mcs.anl.gov> wrote:
>
>>
>>   I stumbled on this today; I should have seen it years ago.
>>
>>   Barry
>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/