[petsc-users] question about PCMG

Barry Smith bsmith at mcs.anl.gov
Fri Sep 24 15:17:32 CDT 2010


  Randy,

   In general, if the number of levels is larger than 1, you cannot just "turn on" MG with a command line flag. You need to add to your code the computation of the interpolation/restriction operators between levels and set them with PCMGSetInterpolation().
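
   In outline, that setup looks something like the sketch below (hedged: `ksp`, `nlevels`, and the interpolation matrices `P[]` are placeholders for objects your code would have to create; error checking is omitted):

```c
/* Sketch only: assumes ksp, nlevels, and interpolation Mats P[1..nlevels-1]
   have already been built by your code. */
PC       pc;
PetscInt l;
KSPGetPC(ksp, &pc);
PCSetType(pc, PCMG);
PCMGSetLevels(pc, nlevels, PETSC_NULL);
for (l = 1; l < nlevels; l++) {
  /* P[l] interpolates from level l-1 (coarser) to level l (finer);
     computing these operators is the part you must add yourself. */
  PCMGSetInterpolation(pc, l, P[l]);
}
```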

   That said, if you use only 1 level then it does not need any interpolation/restriction and should in theory run without crashing (of course it is then just using one level of multigrid and hence is no different than your previous solver).

   So why does it crash? You need to run with -start_in_debugger and see exactly where it crashes.
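
   For example (a sketch only, reusing the executable path and options from the command line quoted below), the same run can be launched under the debugger with:

```
mpiexec -np 1 /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd \
         -ksp_type bcgs -pc_type mg -pc_mg_levels 1 -pc_mg_galerkin \
         -start_in_debugger
```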

   Barry

On Sep 24, 2010, at 3:12 PM, Randall Mackie wrote:

> I am interested in exploring whether or not PCMG would be beneficial for my problem, which I currently
> solve using a KSP of bcgs and an ILU PC on a DA.
> 
> So, as a first step, I just wanted to see what happens if I switched my PC from ILU to PCMG using the
> Galerkin option.
> 
> So my command line options are:
> 
> mpiexec -np 1 /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd \
>          -ksp_monitor_true_residual \
>          -ksp_type bcgs \
>          -pc_type mg \
>          -pc_mg_levels 1 \
>          -pc_mg_cycles v \
>          -pc_mg_galerkin \
>          -help \
> << EOF
> 
> 
> However, I keep getting this error message:
> 
> Preconditioner (PC) Options -------------------------------------------------
>   -pc_type <ilu>: Preconditioner (one of) none jacobi pbjacobi bjacobi sor lu shell mg
>       eisenstat ilu icc cholesky asm ksp composite redundant nn mat fieldsplit galerkin exotic openmp asa cp bfbt lsc redistribute (PCSetType)
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
>   Multigrid options
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] PCSetFromOptions_MG line 318 src/ksp/pc/impls/mg/mg.c
> [0]PETSC ERROR: [0] PCSetFromOptions line 166 src/ksp/pc/interface/pcset.c
> [0]PETSC ERROR: [0] KSPSetFromOptions line 320 src/ksp/ksp/interface/itcl.c
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT 2010
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd on a linux-int named he999.prod.houston.nam.slb.com by rmackie Fri Sep 24 13:06:12 2010
> [0]PETSC ERROR: Libraries linked from /home/rmackie/SPARSE/PETsc/petsc-3.1-p4/linux-intel-debug/lib
> [0]PETSC ERROR: Configure run at Fri Sep 24 13:02:26 2010
> [0]PETSC ERROR: Configure options --with-fortran --with-fortran-kernels=1 --with-blas-lapack-dir=/opt/intel/cmkl/10.2.2.025/lib/em64t -with-scalar-type=complex --with-debugging=1 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> 
> 
> I am running the latest version of PETSc. Any help getting past this error message would be appreciated.
> 
> Randy M.
> 
