[petsc-dev] Integrating PFLOTRAN, PETSC & SAMRAI

Jed Brown jed at 59A2.org
Tue Jun 7 12:43:39 CDT 2011


On Tue, Jun 7, 2011 at 19:05, Boyce Griffith <griffith at cims.nyu.edu> wrote:

> Another might be easier access to CUDA.


What do SAMRAI matrices look like? In particular, how are they constructed?

Your mention of CUDA offers a different approach. Instead of using the
non-contiguous storage directly, you could copy it to contiguous storage.
Thus you can make a Vec type (or adorn an existing Vec) with SAMRAI
information. Then you would have an interface with two functions

VecSAMRAIGetVector(Vec,SAMRAIVector*);
VecSAMRAIRestoreVector(Vec,SAMRAIVector*);

The first of these would create the SAMRAI storage if it didn't exist yet and
copy the data over from the Vec. If the SAMRAIVector is then modified, a flag
would be set in the Vec indicating that the copy "on the SAMRAI side" is the
up-to-date one. If the Vec is later accessed on the PETSc side via
VecGetArray() or similar, the data would be copied back. If it's modified on
the PETSc side, the SAMRAI copy would be marked as invalid.
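A minimal sketch of that validity-flag protocol, with mock contiguous storage
standing in for both sides (the struct layout, the *_valid flags, and these
stripped-down VecSAMRAIGetVector/VecSAMRAIRestoreVector/VecGetArray bodies are
all illustrative assumptions, not actual PETSc or SAMRAI API):

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

#define N 8

typedef struct {
  double data[N];        /* stand-in for discontiguous SAMRAI patch storage */
} SAMRAIVector;

typedef struct {
  double contiguous[N];  /* PETSc-native contiguous storage */
  SAMRAIVector *samrai;  /* lazily created mirror, NULL until first Get */
  int samrai_valid;      /* nonzero: the SAMRAI copy is up to date */
  int petsc_valid;       /* nonzero: the contiguous copy is up to date */
} Vec;

/* Create SAMRAI storage on first use; refresh it from the Vec if stale. */
void VecSAMRAIGetVector(Vec *x, SAMRAIVector **sv)
{
  if (!x->samrai) x->samrai = calloc(1, sizeof(SAMRAIVector));
  if (!x->samrai_valid) {
    memcpy(x->samrai->data, x->contiguous, sizeof(x->contiguous));
    x->samrai_valid = 1;
  }
  *sv = x->samrai;
}

/* Caller may have modified the SAMRAI side: mark the PETSc copy stale. */
void VecSAMRAIRestoreVector(Vec *x, SAMRAIVector **sv)
{
  x->petsc_valid = 0;
  *sv = NULL;
}

/* PETSc-side access: copy back from SAMRAI if the contiguous copy is
 * stale, and assume the caller may write through the returned array. */
double *VecGetArray(Vec *x)
{
  if (!x->petsc_valid && x->samrai) {
    memcpy(x->contiguous, x->samrai->data, sizeof(x->contiguous));
    x->petsc_valid = 1;
  }
  x->samrai_valid = 0;
  return x->contiguous;
}
```

A round trip would then be: fill the Vec, Get the SAMRAIVector (triggering the
copy), modify it, Restore it, and a subsequent VecGetArray() sees the new
values because the stale contiguous copy is refreshed on access.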

Reasons for this instead of using a single copy in discontiguous storage:

1. All PETSc solver components would work.

2. Some operations will be faster using contiguous storage.

3. You can easily have a run-time option to switch.


The downside is possibly using more memory, but in most use cases, only a
couple of Vecs are actually touched by the application. Most of the Vecs live
in the Krylov space or in stored time steps; those would be lighter and faster
using contiguous (PETSc-native) storage. Since VecSAMRAIGetVector() would only
be called on the ones the user interacts with directly, maintaining the
duplicate copies probably only costs a couple of vectors (which could mean
less storage in total when summed over the whole Krylov space).

What do SAMRAI matrices look like? What is the assembly code like? Can we
preserve that API and get the data to go directly into a PETSc format?