[petsc-users] Iterative solver in Petsc on a multiprocessor computer

Shri abhyshr at mcs.anl.gov
Thu Sep 8 14:08:36 CDT 2011


Amrit, 
These plots just restate the problem you described earlier; they give no clue as to what the issue is. I suggest the following:
Try running your code with a direct solver, mpiexec -n 6 test3 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps, and see whether you get the expected results on 6 processors. If you do, then, as Jed points out, the iterative solver options need to be tuned for your problem. Note that while a direct solver may be able to solve your problem, it is not scalable, so iterative solvers are preferred. If the results are still not correct, then perhaps something is amiss in your code.
The default 'parallel' preconditioner in PETSc is block Jacobi with an ILU(0) factorization on each block, which may not be a suitable preconditioner on a larger number of processors.
Try 
mpiexec -n 6 test3 -ksp_type gmres -pc_type bjacobi -sub_pc_type lu 


This does an LU factorization on each block instead of ILU(0).
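
For reference, here is a minimal standalone sketch (a toy 1-D Laplacian I made up for illustration, not your code) of how those options take effect: as long as the program calls KSPSetFromOptions(), -ksp_type, -pc_type, and -sub_pc_type can be switched at run time without recompiling, and adding -ksp_view will print exactly which solver and preconditioner were used.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       i, rstart, rend, n = 100;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Toy 1-D Laplacian, distributed by rows across the processes */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    if (i > 0)     { ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    if (i < n - 1) { ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr); /* MatGetVecs() in older releases */
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr); /* older releases take an extra MatStructure argument */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);     /* reads -ksp_type, -pc_type, -sub_pc_type, ... */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Assuming test3 is structured the same way, the command-line options above should apply to it unchanged.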


Perhaps you should also attach your code and the command-line options you are using.
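
While you are at it, a quick sanity check that costs only a few extra lines (a sketch only; I am assuming your matrix and right-hand side are called C and b, so rename to match your code): print a couple of norms of the assembled system and compare them between the -n 2 and -n 6 runs. If the numbers differ, the system itself is not the same on 6 processors, which would point at the assembly rather than the solver.

#include <petscksp.h>

/* Hypothetical helper, not taken from your code: prints norms of the
   assembled system so runs on different numbers of processes can be
   compared. Call it once, right after the matrix and vector assembly. */
static PetscErrorCode PrintSystemNorms(Mat C, Vec b)
{
  PetscReal      Cnorm, bnorm;
  PetscErrorCode ierr;

  ierr = MatNorm(C, NORM_FROBENIUS, &Cnorm);CHKERRQ(ierr);
  ierr = VecNorm(b, NORM_2, &bnorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "||C||_F = %g  ||b||_2 = %g\n",
                     (double)Cnorm, (double)bnorm);CHKERRQ(ierr);
  return 0;
}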


Shri 

----- Original Message -----



Please find the attached plots in which I compare the numerical results from PETSc with the analytical result: for 2 processors (2.pdf) they agree well, but for the 6-processor run (6.pdf) the numerical result is complete garbage.

My major confusion is this: why does the default GMRES *work* for 2 processors and not for 6 processors? The GMRES algorithm is definitely not the problem here, because otherwise it would not give me the correct result when I run my simulation on 2 processors. Please note that I do *not* change anything in my code other than the number of processors in this command at run time:
mpiexec -n 2 test3 

Also, I make sure that I write the output to a separate file. It is very hard for me to believe it when you say:
"(a) the problem you are building is not actually the same or the results are being misinterpreted or " 

It is the same. Like I said above, I am only changing the number of processors at run time. 

"(b) the default methods are not working well and needs some algorithmic adjustments for your problem." 

If this were the issue, then why does it work for two processors?


The problem I am simulating is the vector wave equation in two dimensions (in other words, Maxwell's equations). The coefficient matrix is assembled using the finite-difference frequency-domain method on a Yee grid.


I would appreciate any help in this regard.

Thanks. 
-Amrit 




Date: Thu, 8 Sep 2011 20:07:04 +0200 
From: jedbrown at mcs.anl.gov 
To: petsc-users at mcs.anl.gov 
Subject: Re: [petsc-users] Fwd: nonzero prescribed boundary condition 


On Thu, Sep 8, 2011 at 19:57, amrit poudel < amrit_pou at hotmail.com > wrote: 



After running my simulation multiple times on a multiprocessor computer, I have verified that using an iterative solver (the default GMRES) in PETSc to solve a linear system of equations (Cx = b) on more than 2 processors ALWAYS leads to an erroneous result. Running identical code with identical settings, except for the number of processors (set to 2), ALWAYS gives me the correct result.



You have to explain what "erroneous result" means here. 




I am really not sure what the point is of including iterative solvers if they produce erroneous results on a multiprocessor computer. The result I get from the multiprocessor run is complete garbage, so I am really not talking about a small percentage of error here. Also, if somebody could explain why the iterative solvers are error-prone on multiple processors, that would be highly appreciated.

Well, let's not jump to conclusions. Iterative solvers can fail, as can direct solvers, but it's more common that (a) the problem you are building is not actually the same or the results are being misinterpreted, or (b) the default methods are not working well and need some algorithmic adjustments for your problem.
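
A quick way to tell (a) and (b) apart, as a sketch only (I am assuming your operator, right-hand side, solution, and solver are called A, b, x, and ksp; rename to match your code): ask the KSP why it stopped and compute the true residual yourself, right after KSPSolve(). You can get much of the same information without touching the code by running with -ksp_converged_reason -ksp_monitor_true_residual.

KSPConvergedReason reason;
Vec                r;
PetscReal          rnorm, bnorm;
PetscErrorCode     ierr;

ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
ierr = VecDuplicate(b, &r);CHKERRQ(ierr);
ierr = MatMult(A, x, r);CHKERRQ(ierr);      /* r = A x     */
ierr = VecAYPX(r, -1.0, b);CHKERRQ(ierr);   /* r = b - A x */
ierr = VecNorm(r, NORM_2, &rnorm);CHKERRQ(ierr);
ierr = VecNorm(b, NORM_2, &bnorm);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "converged reason %d, ||b - A x|| / ||b|| = %g\n",
                   (int)reason, (double)(rnorm/bnorm));CHKERRQ(ierr);
ierr = VecDestroy(&r);CHKERRQ(ierr);

A negative reason means the iteration diverged or hit its iteration limit, which is case (b); a small relative residual together with an answer that still looks wrong points at case (a).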


Please explain what kind of problem you are solving, how you are going about it, and what symptoms you have observed. 

