[petsc-users] Use parallel PETSc linear solver in a sequential program
Jed Brown
jedbrown at mcs.anl.gov
Thu Oct 31 15:06:55 CDT 2013
Qin Lu <lu_qin_2000 at yahoo.com> writes:
> Hello,
>
> I have successfully set up PETSc's sequential linear solver in a sequential program. The matrix was created in CSR format and passed to MatCreateSeqAIJWithArrays (instead of using MatSetValues).
>
> Now I want to use a parallel PETSc solver in the same sequential program, using the same CSR matrix mentioned above. I intend to use MatCreateMPIAIJWithArrays, since I hope to pass the arrays directly to PETSc instead of using MatSetValues. There are a few questions:
Use MatSetValues. It uses less memory and is more flexible.
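For example, inserting one global row at a time looks roughly like this
(a sketch; i, ncols, cols, and vals are assumed to describe one row of
your matrix, and A must already be preallocated):

  ierr = MatSetValues(A,1,&i,ncols,cols,vals,INSERT_VALUES);CHKERRQ(ierr);
  /* ... after all locally owned rows have been set ... */
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);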
> 1. Is there a sample code for using MatCreateMPIAIJWithArrays?
Please don't use this function. It only exists to support legacy users
who were already using it. The implementation calls MatSetValues anyway,
because the input format is not suitable to compute with directly.
> 2. There seems to be a subroutine MatDistribute_MPIAIJ, but it is not
> in the PETSc manual. Where can I find instructions on how to use it? It
> would be nice if sample code were available.
That is a private function. Use MatSetValues.
> 3. The manual says the matrix will be copied by MatCreateMPIAIJWithArrays
> into an internal format (while it is not when using
> MatCreateSeqAIJWithArrays); does this mean I have to call
> MatMPIAIJSetPreallocation first?
Not if you call MatCreateMPIAIJWithArrays, but to avoid the extra memory
(and to simplify your code), you should compute the preallocation and
then call MatSetValues once per row, per element, or at whatever
granularity is convenient.
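As a rough sketch of that workflow, assuming ia/ja/a are your existing
0-based CSR arrays holding the rows owned by this process, ja contains
global column indices, and A is an MPIAIJ matrix whose type and sizes
have already been set (the helper name FillMatrixFromCSR is just for
illustration):

  #include <petscmat.h>

  static PetscErrorCode FillMatrixFromCSR(Mat A,const PetscInt *ia,
                                          const PetscInt *ja,const PetscScalar *a)
  {
    PetscErrorCode ierr;
    PetscInt       i,k,rstart,rend,cstart,cend,nlocal,*d_nnz,*o_nnz;

    PetscFunctionBeginUser;
    /* MatSetUp() fixes the parallel layout so the ownership ranges can be
       queried; the explicit preallocation below replaces the default one. */
    ierr = MatSetUp(A);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
    ierr = MatGetOwnershipRangeColumn(A,&cstart,&cend);CHKERRQ(ierr);
    nlocal = rend - rstart;
    ierr = PetscMalloc(nlocal*sizeof(PetscInt),&d_nnz);CHKERRQ(ierr);
    ierr = PetscMalloc(nlocal*sizeof(PetscInt),&o_nnz);CHKERRQ(ierr);
    /* Count nonzeros per row in the diagonal and off-diagonal blocks */
    for (i=0; i<nlocal; i++) {
      d_nnz[i] = o_nnz[i] = 0;
      for (k=ia[i]; k<ia[i+1]; k++) {
        if (ja[k] >= cstart && ja[k] < cend) d_nnz[i]++;
        else                                 o_nnz[i]++;
      }
    }
    ierr = MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);CHKERRQ(ierr);
    ierr = PetscFree(d_nnz);CHKERRQ(ierr);
    ierr = PetscFree(o_nnz);CHKERRQ(ierr);
    /* Insert the values one row at a time, then assemble */
    for (i=0; i<nlocal; i++) {
      PetscInt row = rstart + i;
      ierr = MatSetValues(A,1,&row,ia[i+1]-ia[i],ja+ia[i],a+ia[i],INSERT_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }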
> 4. In the sample code ex2.c in the manual, both MatMPIAIJSetPreallocation
> and MatSeqAIJSetPreallocation are called to preallocate the matrix
> A. Isn't the first subroutine alone sufficient?
The example can run in serial or in parallel. Only the matching
preallocation is used; any others are ignored.
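For reference, the pair of calls in ex2.c looks roughly like this (the
per-row nonzero counts are placeholders here; use values appropriate to
your matrix):

  ierr = MatMPIAIJSetPreallocation(A,5,NULL,5,NULL);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(A,5,NULL);CHKERRQ(ierr);

Each call takes effect only if A actually has the corresponding type, so
the same code works on one process and on many.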