<font face="Ubuntu">Thank you.<br>
<br>
Michele<br>
<br>
</font>
<div class="moz-cite-prefix">On 07/18/2012 09:09 AM, Satish Balay
wrote:<br>
</div>
<blockquote cite="mid:alpine.LFD.2.02.1207181108530.23520@asterix"
type="cite">
<pre wrap="">Its now useable on Intrepid with PETSC_DIR=/soft/apps/libraries/petsc/3.3-p2/xl-opt
Check /soft/apps/libraries/petsc/README
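> For example (just a sketch; the README above has the authoritative
> build/run instructions for Intrepid), with a bash-style shell:
>
>   export PETSC_DIR=/soft/apps/libraries/petsc/3.3-p2/xl-opt
>
> and then recompile/relink your application against it.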
>
> Satish
>
> On Tue, 17 Jul 2012, Michele Rosso wrote:
>> Thanks a lot.
>> Please let me know when version 3.3 is available.
>>
>> Michele
>>
>> On 07/17/2012 12:13 PM, Barry Smith wrote:
>>>>> Please update to petsc-3.3. petsc-3.1 is too old.
>>>> I would do that but the version installed on the platform (Intrepid
>>>> at ALCF) I am working on is 3.1-p2.
>>>
>>> Satish,
>>>
>>> Please fix this.
>>>
>>> Thanks
>>>
>>> Barry
>>>
>>> On Jul 17, 2012, at 1:36 PM, Michele Rosso wrote:
>>>> On 07/17/2012 11:03 AM, Hong Zhang wrote:
>>>>> Michele:
>>>>>> I have some problems with the block jacobi preconditioner.
>>>>>> I am solving a 3D Poisson equation with periodic BCs, discretized
>>>>>> by using finite differences (7-point stencil).
>>>>>> Thus the problem is singular and the nullspace has to be removed.
>>>>> For Poisson equations, a multigrid preconditioner should be the
>>>>> method of choice.
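>>>>> For example (a sketch, not tested on your problem; gamg ships with
>>>>> petsc-3.3, while hypre is available only if the install was
>>>>> configured with it):
>>>>>
>>>>>   -ksp_type cg -pc_type gamg
>>>>>
>>>>> or
>>>>>
>>>>>   -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg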
<pre wrap="">Thank you for the suggestion. I do not have any experience with multigrid,
but I will try.
</pre>
>>>>>> If I solve with the PCG method + JACOBI preconditioner the results
>>>>>> are fine.
>>>>>> If I use PCG + Block Jacobi preconditioner + ICC on each block the
>>>>>> results are fine on the majority of the processors,
>>>>>> but on a few of them the error is very large.
>>>>> How do you know "a few of them"?
<pre wrap=""> Basically the solution is not correct on some grid points, say 6 grid
nodes out of 64^3. The 6 grid nodes with problems belongs to 2 of the 128
processors
I am using.
</pre>
>>>>>> Do you have any idea/suggestions on how to fix this problem?
>>>>>> This is the fragment of code I am using (petsc 3.1 and Fortran 90):
>>>>> Please update to petsc-3.3. petsc-3.1 is too old.
>>>> I would do that but the version installed on the platform (Intrepid
>>>> at ALCF) I am working on is 3.1-p2.
>>>>>> PetscErrorCode petsc_err
>>>>>> Mat A
>>>>>> PC pc, subpc
>>>>>> KSP ksp
>>>>>> KSP subksp(1)
>>>>>> MatNullSpace nullspace
>>>>>> :
>>>>>> :
>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
>>>>>> call KSPSetType(ksp,KSPCG,petsc_err) ! the default type is gmres; I guess you want CG
>>>>>> call KSPGetPC(ksp,pc,petsc_err)
>>>>>> call PCSetType(pc,PCBJACOBI,petsc_err)
>>>>>> ! call KSPSetUp(ksp,petsc_err)  ! call this at the end
>>>>>> ! KSP context for each single block
>>>>>> call PCBJacobiGetSubKSP(pc,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>>>>>>      subksp(1),petsc_err)
>>>>>> call KSPGetPC(subksp(1),subpc,petsc_err)
>>>>>> call PCSetType(subpc,PCICC,petsc_err)
>>>>>> call KSPSetType(subksp(1),KSPPREONLY,petsc_err)
>>>>>> call KSPSetTolerances(subksp(1),tol,PETSC_DEFAULT_DOUBLE_PRECISION, &
>>>>>>      PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,petsc_err)
>>>>>> ! Remove the constant nullspace from the singular system
>>>>>> ! (PETSC_NULL_OBJECT is the Fortran null for the unused Vec arguments)
>>>>>> call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT, &
>>>>>>      nullspace,petsc_err)
>>>>>> call KSPSetNullSpace(ksp,nullspace,petsc_err)
>>>>>> call MatNullSpaceRemove(nullspace,b,PETSC_NULL_OBJECT,petsc_err)
>>>>>> call KSPSolve(ksp,b,x,petsc_err)
>>>>> I modified your code slightly. All these options can be provided at
>>>>> runtime:
>>>>> '-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
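>>>>> To check whether the outer solve actually converged, a sketch (these
>>>>> calls exist in petsc-3.3; verify the Fortran interface for your
>>>>> version):
>>>>>
>>>>>   KSPConvergedReason reason
>>>>>   PetscInt its
>>>>>   call KSPGetConvergedReason(ksp,reason,petsc_err)
>>>>>   call KSPGetIterationNumber(ksp,its,petsc_err)
>>>>>
>>>>> or simply run with '-ksp_converged_reason -ksp_monitor'.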
>>>>>
>>>>> Hong
>>>>>> On 07/13/2012 12:14 PM, Hong Zhang wrote:
>>>>>>> Michele:
>>>>>>>> I need to use the ICC factorization as preconditioner, but I
>>>>>>>> noticed that no parallel version is supported.
>>>>>>>> Is that correct?
>>>>>>> Correct.
>>>>>>>> If so, is there a workaround, like building the preconditioner
>>>>>>>> "by hand" by using PETSc functions?
>>>>>>> You may try block jacobi with icc in the blocks:
>>>>>>> '-ksp_type cg -pc_type bjacobi -sub_pc_type icc'
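>>>>>>> A minimal calling sequence for that route (a sketch; it assumes
>>>>>>> the petsc-3.3 Fortran includes, e.g. finclude/petscksp.h, and
>>>>>>> lets the command line pick the solver and preconditioner):
>>>>>>>
>>>>>>>   Mat A
>>>>>>>   Vec x, b
>>>>>>>   KSP ksp
>>>>>>>   PetscErrorCode petsc_err
>>>>>>>   :
>>>>>>>   call KSPCreate(PETSC_COMM_WORLD,ksp,petsc_err)
>>>>>>>   call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,petsc_err)
>>>>>>>   call KSPSetFromOptions(ksp,petsc_err)
>>>>>>>   call KSPSolve(ksp,b,x,petsc_err)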
>>>>>>>
>>>>>>> Hong