[petsc-users] Fwd: How PETSc solves Ax=b in parallel
Jed Brown
jedbrown at mcs.anl.gov
Mon Oct 21 10:42:30 CDT 2013
paul zhang <paulhuaizhang at gmail.com> writes:
> I am using KSP, more specifically the FGMRES method, with MPI to solve an Ax=b
> system. Here is what I am doing. I cut my computational domain into many
> pieces, and on each piece I compute independently by solving the fluid equations.
> This has nothing to do with PETSc. Finally, I collect all of the
> information and load it into a single global A matrix.
I hope you build parts of this matrix on each processor, as is done in
the examples. Note the range Istart to Iend here:
http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex2.c.html
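The assembly pattern in that example looks roughly like the sketch below
(not your code; ncols, cols, and vals stand in for whatever your
discretization produces for row i, and error checking is omitted):

    PetscInt i, Istart, Iend;
    MatGetOwnershipRange(A, &Istart, &Iend);   /* rows owned by this rank */
    for (i = Istart; i < Iend; i++) {
      /* cols[]/vals[] hold the nonzero columns and entries of row i;
         entries destined for rows owned by other ranks may also be set
         here and are communicated during assembly */
      MatSetValues(A, 1, &i, ncols, cols, vals, INSERT_VALUES);
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);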
> My question is how the PETSc functions work in parallel in my case. I have
> two guesses. First, PETSc solves its own matrix for each domain on the
> local processor, even though A is global. In that case, quantities like the
> number of iterations and the solution vector should come back once per
> processor, but I get only one value for each of them. The reason must be
> that the processors talk to each other once all of their work is done,
> which is why I receive an "all-reduced" value. This seems more logical to
> me than my second guess.
It does not work that way: the solution operators are global, so to solve
the problem, the iteration must be global.
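That is also why you see a single iteration count: KSPSolve runs one
global Krylov iteration loop, and every rank reports the same number.
A minimal sketch, assuming a KSP named ksp, vectors b and x, and a
PetscInt its:

    KSPSolve(ksp, b, x);               /* one global Krylov iteration loop */
    KSPGetIterationNumber(ksp, &its);  /* same single value on every rank */
    PetscPrintf(PETSC_COMM_WORLD, "iterations = %D\n", its);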
> In the second guess, the system is also solved in parallel, but the PETSc
> functions redistribute the global sparse matrix A across the processors
> after it has been loaded. That is to say, each processor may not end up
> solving its own partition of the matrix.
Hopefully you build the matrix already-distributed. The default
_preconditioner_ is local, but the iteration is global. PETSc does not
"redistribute" the matrix automatically, though if you call
MatSetSizes() and pass PETSC_DECIDE for the local sizes, PETSc will
choose them.
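Concretely, a sketch with N standing for the global problem size:

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);  /* PETSc picks the local row counts */
    MatSetFromOptions(A);
    MatSetUp(A);
    /* MatGetOwnershipRange(A, &Istart, &Iend) then reports the contiguous
       block of rows this process owns, which you fill as above. */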