[petsc-users] KSPGMRESOrthog costs too much time
Jed Brown
jedbrown at mcs.anl.gov
Thu Nov 17 12:50:23 CST 2011
On Thu, Nov 17, 2011 at 12:18, Rongliang Chen <rongliang.chan at gmail.com> wrote:
> I am using the composite pc now not the PCMG:
>
> ierr = PCCompositeAddPC(finepc,PCSHELL);CHKERRQ(ierr);
> ierr = PCCompositeAddPC(finepc,PCASM);CHKERRQ(ierr);
>
> ierr = PCCompositeGetPC(finepc,0,&coarsesolve);CHKERRQ(ierr);
> ierr = PCShellSetContext(coarsesolve,ctx);CHKERRQ(ierr);
> ierr = PCShellSetApply(coarsesolve,CoarseSolvePCApply);CHKERRQ(ierr);
>
> ierr = PCCompositeGetPC(finepc,1,&asmpc);CHKERRQ(ierr);
> ierr = PCSetOptionsPrefix(asmpc,"fine_");CHKERRQ(ierr);
> ierr = PCSetFromOptions(asmpc);CHKERRQ(ierr);
>
> ierr = PCSetType(asmpc,PCASM);CHKERRQ(ierr);
> ierr = PCASMSetOverlap(asmpc,0);CHKERRQ(ierr);
> ierr = PCASMSetLocalSubdomains(asmpc,1,&grid->df_global_asm,PETSC_NULL);CHKERRQ(ierr);
>
You can register your own logging event for the coarse-level solve, so its time
shows up separately in the profiling output. Using PCMG instead of PCComposite
would also make your code more flexible, so you may want to consider switching
at some point.
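A minimal sketch of what registering such an event looks like (the event name
"CoarseSolve" and the helper function are illustrative, not from the original
code). Register once during setup, then bracket the solve inside
CoarseSolvePCApply; the event then gets its own line in the -log_summary table:

```c
#include <petscsys.h>

/* Hypothetical custom event for timing the coarse-level solve. */
static PetscLogEvent CoarseSolveEvent;

/* Call once during setup, before the first coarse solve. */
PetscErrorCode RegisterCoarseSolveEvent(void)
{
  PetscErrorCode ierr;
  ierr = PetscLogEventRegister("CoarseSolve",PC_CLASSID,&CoarseSolveEvent);CHKERRQ(ierr);
  return 0;
}

/* Then, inside CoarseSolvePCApply, wrap the actual solve:

  ierr = PetscLogEventBegin(CoarseSolveEvent,0,0,0,0);CHKERRQ(ierr);
  ... coarse-level solve ...
  ierr = PetscLogEventEnd(CoarseSolveEvent,0,0,0,0);CHKERRQ(ierr);
*/
```

With this in place, the fraction of time attributable to the coarse solve can
be read directly from the profiling summary instead of being inferred.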
>
> I am just using a two-level method now, and it is not easy to try more levels
> since I am using unstructured meshes.
> I tried solving the coarse level exactly with LU, and it works well if the
> coarse problem is small. When the coarse problem is large, LU is also very
> slow (and when the fine-level problem is large, I cannot use a very small
> coarse-level problem). I found that nearly 90% of the time is spent on the
> coarse level when the number of processors is large (np > 512), so I want to
> know which step of the coarse-level solver costs most of the time. Thanks.
>
This is a common problem. Depending on your equations, you might be able to
solve the coarse-level problem using algebraic multigrid. Try
-coarse_pc_type gamg (or however you set up the options prefix; "hypre" or
"ml" may also work if you have those packages installed).
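Assuming a "coarse_" options prefix is attached to the coarse-level solver
(the actual prefix depends on how the code sets it up), the run-time options
would look something like:

```shell
# Algebraic multigrid on the coarse problem (prefix "coarse_" is an assumption):
-coarse_pc_type gamg

# Alternatives, if PETSc was configured --with the corresponding packages:
#   -coarse_pc_type hypre -coarse_pc_hypre_type boomeramg
#   -coarse_pc_type ml
```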