[petsc-users] Parallel vs. single processor

Rob Kunz rfk102 at arl.psu.edu
Tue Nov 13 07:16:00 CST 2012


I have successfully implemented GAMG and I am now getting convergence as expected from a multigrid solver!
Two issues:

1)      Every iteration of the CFD solver, the pressure equation takes more solution time. At iteration 1 the code executes at 5e-5 wall seconds/(cell*iteration); by iteration 1000 it executes at 3e-4 wall seconds/(cell*iteration), increasing steadily in between. Why does this happen?

2)      I would like to see some documentation on how to control GAMG; the online docs are too limited for my modest experience, and the single example available is very basic. I am particularly interested in getting the solver to coarsen only in the strong-coefficient direction, with multiple-cell agglomeration, i.e., to make it behave like a block-correction multigrid (a sketch of the kind of options I mean is below).
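
For reference, the kind of runtime control I have in mind looks something like the options below. This is only a sketch: the option names are taken from the PETSc GAMG/MG manual pages, the threshold and smoother settings are placeholder values, and I have not verified them against my code (a PETSc options file accepts # comments):

   -pc_type gamg                    # algebraic multigrid preconditioner
   -pc_gamg_threshold 0.05          # drop weak couplings when forming aggregates (placeholder value)
   -pc_gamg_agg_nsmooths 1          # smoothed aggregation
   -mg_levels_ksp_type richardson   # smoother KSP on each level
   -mg_levels_pc_type sor           # smoother PC on each level
   -mg_levels_ksp_max_it 2          # smoothing sweeps per level
   -ksp_view                        # report the hierarchy actually built
   -log_summary                     # per-stage timings, to see where the slowdown in issue 1 comes from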
Thanks
Rob

From: petsc-users-bounces at mcs.anl.gov On Behalf Of Jed Brown
Sent: Monday, November 12, 2012 11:31 AM
To: PETSc users list
Subject: Re: [petsc-users] Parallel vs. single processor

Try algebraic multigrid (e.g., -pc_type gamg) and additive Schwarz (-pc_type asm). Also try using ASM as a smoother.
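
For example, something along these lines (a sketch only; the overlap, sub-solver, and smoother choices are illustrative starting points and should be checked against -help output):

   # additive Schwarz with an ILU(0) solve on each (overlapping) subdomain
   -ksp_type gmres -pc_type asm -pc_asm_overlap 1 -sub_pc_type ilu

   # GAMG with ASM/ILU as the level smoother instead of the default
   -pc_type gamg -mg_levels_ksp_type richardson -mg_levels_pc_type asm -mg_levels_sub_pc_type ilu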

On Mon, Nov 12, 2012 at 10:21 AM, Rob Kunz <rfk102 at arl.psu.edu> wrote:
Hi:
I have a pressure-based CFD code that uses PETSc for a nearly-SPD pressure equation.
The PETSc default solver + preconditioner (GMRES + block Jacobi for n processors) has been very reliable for me.
However, I have a very tough bearing problem with extremely high-aspect-ratio cells: the 1-processor run works fine (GMRES + ILU(0)), but for nproc > 1 the code does not converge.
Of course, I can get identical 1-processor and 12-processor convergence histories if I choose a point preconditioner like Jacobi, but that kills the convergence of both.
Any recommendations?
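
For reference, the runs above correspond roughly to these options (a sketch; the monitoring flags are just what I would add to compare the 1-process and 12-process histories):

   # 1 process: PETSc default, GMRES + ILU(0)
   -ksp_type gmres -pc_type ilu

   # n processes: PETSc default, GMRES + block Jacobi with ILU(0) on each block
   -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu

   # diagnostics for comparing convergence histories
   -ksp_monitor_true_residual -ksp_converged_reason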
Thanks
Rob Kunz
