[petsc-users] Fwd: How PETSc solves Ax=b in parallel
paul zhang
paulhuaizhang at gmail.com
Mon Oct 21 10:30:28 CDT 2013
Hi Jed,
I have a question for you.
I am using KSP, more specifically the FGMRES method, with MPI to solve an Ax=b
system. Here is what I am doing. I cut my computational domain into many
pieces, and on each piece I compute independently by solving the fluid
equations. That part has nothing to do with PETSc. Finally, I collect all of
the information and load it into a global A matrix. Then I call the PETSc
functions, which solve this system in parallel. Afterwards, I take the update
of x, add it to the initial guess, and do another iteration.
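For reference, something like the minimal sketch below is what I mean by loading A and calling the solver. It is only an illustration: the size N, the vector names b and dx, and the sparsity handling are placeholders, and error checking is omitted.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      b, dx;                /* right-hand side and the update to x */
  KSP      ksp;
  PetscInt N = 100;              /* placeholder global size */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Parallel sparse matrix: each rank holds a contiguous block of rows. */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetFromOptions(A);
  MatSetUp(A);
  /* ... each rank calls MatSetValues() with the entries it computed ... */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  VecCreate(PETSC_COMM_WORLD, &b);
  VecSetSizes(b, PETSC_DECIDE, N);
  VecSetFromOptions(b);
  VecDuplicate(b, &dx);
  /* ... each rank fills its part of b ... */
  VecAssemblyBegin(b);
  VecAssemblyEnd(b);

  /* One global FGMRES solve in which every rank participates. */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPFGMRES);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, dx);

  KSPDestroy(&ksp);
  VecDestroy(&dx);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}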
My question is how the PETSc functions work in parallel in my case. I have two
guesses. In the first, PETSc solves its own piece of the matrix on each local
processor, even though A is global. If that were the case, quantities such as
the number of iterations and the solution vector should appear once per
processor, yet I get only one value for each of them. The explanation would be
that the processors talk to each other once all of their work is done, so what
I see is the "all-reduced" value. This seems more logical to me than my second
guess.
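This is what I mean by getting only one value: when I query the solve afterwards, every rank reports the same numbers. A small sketch of the query (the variable names are placeholders, ksp is the solver from above):

  PetscInt    its;
  PetscReal   rnorm;
  PetscMPIInt rank;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  KSPGetIterationNumber(ksp, &its);   /* one count for the whole global solve */
  KSPGetResidualNorm(ksp, &rnorm);    /* global residual norm */
  PetscPrintf(PETSC_COMM_SELF, "[rank %d] iterations %d, residual norm %g\n",
              rank, (int)its, (double)rnorm);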
In the second guess, the system is also solved in parallel, but the PETSc
functions redistribute the global sparse matrix A across the processors after
its assembly is complete. That is to say, each processor may then no longer be
solving its own partition of the matrix.
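To make the question concrete, this sketch (using the A from above) would print the block of global rows each rank holds after assembly, which is how I would check the partition:

  PetscInt    rstart, rend;
  PetscMPIInt rank;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  /* Reports the contiguous range of global rows stored on this rank. */
  MatGetOwnershipRange(A, &rstart, &rend);
  PetscPrintf(PETSC_COMM_SELF, "[rank %d] owns rows %d through %d\n",
              rank, (int)rstart, (int)rend - 1);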
Which one is right?
Thanks,
Paul
--
Huaibao (Paul) Zhang
*Gas Surface Interactions Lab*
Department of Mechanical Engineering
University of Kentucky,
Lexington, KY 40506-0503
*Office*: 216 Ralph G. Anderson Building
*Web*: gsil.engineering.uky.edu