[petsc-dev] Integrating PFLOTRAN, PETSC & SAMRAI

Jed Brown jed at 59A2.org
Tue Jun 7 13:32:44 CDT 2011


On Tue, Jun 7, 2011 at 20:17, Boyce Griffith <griffith at cims.nyu.edu> wrote:

> There is not a notion of matrices in SAMRAI --- only vectors.  (I believe
> that SAMRAI was designed for and is still used, at least at LLNL, mainly for
> hyperbolic problems, so they never have to solve large systems of
> equations.)
>

Some hyperbolic problems are stiff....


> At one point, I tried implementing the local part of the matrix-free
> matrices using PETSc Mat's, but performance was modestly lower than a pure
> Fortran implementation, and so I abandoned this.  This was for a
> cell-centered Poisson problem using a 5/7-point finite-volume stencil. I
> think this even held in the case where there was only one patch per
> processor --- i.e., contiguous local storage --- although I may be
> misremembering.
>

What do you mean? Matrix-free _should_ be much faster in cases like this
because there's much less memory to deal with. The assembled matrix is
useful for other things (like preconditioning), but it's a slow way to do
MatMult(). The tradeoffs are a bit different for different discretizations:
assembled matrices are good for lowest-order FEM, or (often) if you actually
solve Riemann problems.
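
A minimal sketch of what I mean by matrix-free, assuming a single patch with
contiguous storage and no ghost region (the grid sizes, context layout, and
names below are illustrative, not SAMRAI or PFLOTRAN code): the 5-point
stencil is applied straight from the Vec arrays, so the only memory traffic
is the two vectors themselves.

  #include <petscmat.h>

  typedef struct {
    PetscInt  mx, my;    /* local cells in x and y (single patch, no ghosts) */
    PetscReal scale;     /* e.g. 1/h^2 */
  } StencilCtx;

  /* y = A*x applied stencil-by-stencil; nothing is assembled or stored.
     Boundary rows simply drop the missing neighbors (Dirichlet, eliminated). */
  static PetscErrorCode StencilMult(Mat A, Vec x, Vec y)
  {
    StencilCtx        *c;
    const PetscScalar *xa;
    PetscScalar       *ya;
    PetscInt          i, j;
    PetscErrorCode    ierr;

    PetscFunctionBegin;
    ierr = MatShellGetContext(A, (void**)&c);CHKERRQ(ierr);
    ierr = VecGetArrayRead(x, &xa);CHKERRQ(ierr);
    ierr = VecGetArray(y, &ya);CHKERRQ(ierr);
    for (j = 0; j < c->my; j++) {
      for (i = 0; i < c->mx; i++) {
        PetscInt    r = j*c->mx + i;
        PetscScalar v = 4.0*xa[r];
        if (i > 0)        v -= xa[r-1];
        if (i < c->mx-1)  v -= xa[r+1];
        if (j > 0)        v -= xa[r-c->mx];
        if (j < c->my-1)  v -= xa[r+c->mx];
        ya[r] = c->scale*v;
      }
    }
    ierr = VecRestoreArrayRead(x, &xa);CHKERRQ(ierr);
    ierr = VecRestoreArray(y, &ya);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Hooked up with something like (mx, my, h assumed defined elsewhere):

  StencilCtx ctx = {mx, my, 1.0/(h*h)};
  Mat        A;
  ierr = MatCreateShell(PETSC_COMM_SELF, mx*my, mx*my, mx*my, mx*my, &ctx, &A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A, MATOP_MULT, (void (*)(void))StencilMult);CHKERRQ(ierr);

The shell Mat can then be handed to any KSP, with the preconditioner built
from a separately assembled (or approximate) matrix.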


> One place where using PETSc Mat's to provide the local part of the matrix
> seems to win is for Vanka-like smoothers for incompressible Stokes.  In this
> case, performing local solves using PETSc seems to be faster than whatever I
> cobbled together.
>

Interface granularity gets tricky in this case.
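
For reference, a hedged sketch of the kind of small local solve described in
the quoted text: one dense per-block system solved with PREONLY+LU, which on
a MATSEQDENSE matrix just calls LAPACK. LocalBlockSolve and its column-major
input are assumptions for illustration; the SAMRAI-side extraction of the
Vanka block is omitted.

  #include <petscksp.h>

  /* Solve one n-by-n local block system Aloc*x = b with dense LU.
     'a' holds the block in column-major order; b and x are Seq Vecs of length n. */
  static PetscErrorCode LocalBlockSolve(PetscInt n, PetscScalar *a, Vec b, Vec x)
  {
    Mat            Aloc;
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatCreateSeqDense(PETSC_COMM_SELF, n, n, a, &Aloc);CHKERRQ(ierr);
    ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, Aloc, Aloc);CHKERRQ(ierr);
    ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = MatDestroy(&Aloc);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

In a real smoother you would keep the KSP and its factorization around
between sweeps instead of recreating them for every block.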


> Getting this right can be a lot of work, because parallel AMR data indexing
> is not so simple for data centerings other than cell-centered,
>

There must still be a deterministic way to traverse the degrees of freedom
(using SAMRAI APIs). That's all that is needed to implement this interface.
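
Concretely, whatever deterministic order the SAMRAI traversal visits the
local DOFs in can simply define the local numbering. Something like the
following (the contiguous-ownership assumption and all names are
illustrative) turns that into a mapping PETSc can use:

  #include <petscis.h>

  /* nlocal = number of DOFs this rank visits in its deterministic traversal.
     The i-th DOF visited gets local index i and global index rstart+i.
     Off-process neighbors referenced by a stencil would additionally need
     their owners' global indices; that communication step is omitted here. */
  static PetscErrorCode BuildLocalToGlobal(MPI_Comm comm, PetscInt nlocal,
                                           ISLocalToGlobalMapping *ltog)
  {
    PetscInt       *global, rstart = 0, i;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MPI_Scan(&nlocal, &rstart, 1, MPIU_INT, MPI_SUM, comm);CHKERRQ(ierr);
    rstart -= nlocal;            /* first global index owned by this rank */
    ierr = PetscMalloc1(nlocal, &global);CHKERRQ(ierr);
    for (i = 0; i < nlocal; i++) global[i] = rstart + i;
    ierr = ISLocalToGlobalMappingCreate(comm, 1, nlocal, global, PETSC_OWN_POINTER, ltog);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Attach the mapping with MatSetLocalToGlobalMapping() and all subsequent
insertion can use the traversal order via MatSetValuesLocal() and
VecSetValuesLocal(), regardless of data centering.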


> An alternative, which I had in mind when I said that I was interested in
> trying to use PETSc to provide the storage for the SAMRAI data structures,
> is to use SAMRAI to provide basic grid management, and to use PETSc to
> provide data storage.  One would allocate appropriately sized PETSc vectors
> and set various pointers appropriately so that the SAMRAI data would
> continue to "look like" regular SAMRAI data.  Because the data is actually
> stored in a PETSc Vec, it would also "look like" regular PETSc data from the
> standpoint of a PETSc solver.  Doing this would mean including some
> redundant ghost cell data in the PETSc Vec that provides the actual storage.
>

It makes a big difference where those redundant entries go. If they can go
at the end, then you can share storage. If they go with "their patch", then
you can't have them share the global vector. A different model is to have
SAMRAIGlobalToLocalBegin/End() that internally uses VecScatter (perhaps) and
puts the result in a SAMRAI vector. If you are running in parallel, this step
usually does a copy anyway (e.g. DMDA).