<font face="Ubuntu">Hi Hong,<br>
<br>
I have some problems with the block jacobi preconditioner.<br>
I am solving a 3D Poisson equation with periodic BCs, discretized
using finite differences (7-point stencil).
The problem is therefore singular (the constant vector spans the
nullspace), so the nullspace has to be removed.
If I solve with the PCG method + Jacobi preconditioner, the results
are fine.
If I use PCG + block Jacobi preconditioner + ICC on each block, the
results are fine on the majority of the processors, but on a few of
them the error is very large.
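
For reference, I believe the two setups correspond to the runtime
options below ('poisson3d' is just a placeholder for my executable):

  mpiexec -n 8 ./poisson3d -ksp_type cg -pc_type jacobi
  mpiexec -n 8 ./poisson3d -ksp_type cg -pc_type bjacobi \
      -sub_ksp_type cg -sub_pc_type icc
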
Do you have any ideas/suggestions on how to fix this problem?
This is the fragment of code I am using (PETSc 3.1, Fortran 90):

  PetscErrorCode petsc_err
  Mat A
  Vec b, x                 ! right-hand side and solution
  PC pc, subpc
  KSP ksp
  KSP subksp(1)
  MatNullSpace nullspace
  PetscReal tol
  :
  :
  call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
  call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
  call KSPGetPC(ksp,pc,petsc_err)
  call PCSetType(pc,PCBJACOBI,petsc_err)
  call KSPSetUp(ksp,petsc_err)

  ! KSP context for each local block
  call PCBJacobiGetSubKSP(pc,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,subksp(1),petsc_err)
  call KSPGetPC(subksp(1),subpc,petsc_err)
  call PCSetType(subpc,PCICC,petsc_err)
  call KSPSetType(subksp(1),KSPCG,petsc_err)
  call KSPSetTolerances(subksp(1),tol,PETSC_DEFAULT_DOUBLE_PRECISION,&
 &     PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,petsc_err)

  ! Remove the constant nullspace from the singular system.
  ! Note: the Fortran interface has no plain PETSC_NULL; pass
  ! PETSC_NULL_OBJECT for the optional Vec arguments instead.
  call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT,nullspace,petsc_err)
  call KSPSetNullSpace(ksp,nullspace,petsc_err)
  call MatNullSpaceRemove(nullspace,b,PETSC_NULL_OBJECT,petsc_err)

  call KSPSolve(ksp,b,x,petsc_err)
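
If it helps with debugging, here is a minimal sketch of the extra
diagnostics I could wrap around the solve; KSPSetFromOptions and
KSPGetConvergedReason are standard PETSc calls, and 'reason' is a new
local variable:

  KSPConvergedReason reason

  ! Pick up runtime options such as -ksp_view and
  ! -ksp_monitor_true_residual, to inspect how each block's
  ! sub-solver is actually configured (call before KSPSolve).
  call KSPSetFromOptions(ksp,petsc_err)

  ! After KSPSolve: report why the outer solve terminated.
  call KSPGetConvergedReason(ksp,reason,petsc_err)
  if (reason < 0) print *, 'outer KSP diverged, reason = ', reason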

Thank you,

Michele

<div class="moz-cite-prefix">On 07/13/2012 12:14 PM, Hong Zhang
wrote:<br>
</div>
> Michele:
>
>> I need to use the ICC factorization as a preconditioner, but I
>> noticed that no parallel version is supported.
>> Is that correct?
>
> Correct.
>
>> If so, is there a workaround, like building the preconditioner
>> "by hand" using PETSc functions?
>
> You may try block Jacobi with ICC in the blocks:
> '-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
>
> Hong