[petsc-users] MG Preconditioning
Gaetan Kenway
kenway at utias.utoronto.ca
Sat Sep 1 16:20:49 CDT 2012
Sorry about the last email... I hit send too early.
The full calling sequence is:
call KSPCreate(SUMB_COMM_WORLD, ksp, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
useAD = .False.
useTranspose = .True.
usePC = .True.
call setupStateResidualMatrix(drdwpret,useAD,usePC,useTranspose)
! dRdwT has already been created
call KSPSetOperators(ksp, dRdWT, dRdwpreT, &
DIFFERENT_NONZERO_PATTERN, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call KSPSetFromOptions(ksp, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call KSPSetType(ksp, KSPFGMRES, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call KSPGMRESSetRestart(ksp, adjRestart, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call KSPGMRESSetCGSRefinementType(ksp, &
KSP_GMRES_CGS_REFINE_IFNEEDED, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call kspgetpc(ksp, pc, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call pcsettype(pc, PCMG, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call PetscOptionsSetValue('-pc_mg_levels','2',PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
! Set command line options
call PCSetFromOptions(pc, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
! Create the restriction operator between finest and one below:
call createRestrictOperator(RL1, 1)
call PCMGSetRestriction(pc, 1, RL1, PETScierr)
call EChk(PETScIerr,__FILE__,__LINE__)
call PCSetup(pc, PETScierr)
call EChk(PETScIerr,__FILE__,__LINE__)
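One thing I notice about the sequence above: I only ever provide the restriction, never the interpolation, and the Galerkin code in mg.c (quoted further down) builds the coarse operators from mglevels[i+1]->interpolate. A minimal sketch of also supplying the interpolation, assuming the transpose of RL1 is the right prolongation (MatTranspose and PCMGSetInterpolation are standard PETSc routines; PL1 is just a placeholder name for a new Mat):

! Sketch only: assumes the prolongation is the transpose of my restriction;
! PL1 is a placeholder Mat handle
call MatTranspose(RL1, MAT_INITIAL_MATRIX, PL1, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call PCMGSetInterpolation(pc, 1, PL1, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)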
The error occurs in PCSetUp(), which calls PCSetUp_MG(), and the issue is in
KSPGetVecs() on line 644 in mg.c. Just to clarify, this isn't a memory
problem/segfault: PETSc returns control cleanly and my code exits when the
error is detected, so a backtrace or valgrind doesn't yield any additional
information (I ran both). The terminal output is:
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------
=================================================================
PETSc or MPI Error. Error Code 73. Detected on Proc 0
Error at line: 130 in file: setupPETScKsp.F90
=================================================================
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] MatGetVecs line 8142
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] KSPGetVecs line 774
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
[0]PETSC ERROR: [0] PCSetUp_MG line 508
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: [0] PCSetUp line 810
/nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
I had a look at ex42.c, but it uses a DM, which I am not doing. From the
example there is the following:
ierr = PCMGSetLevels(pc,nlevels,PETSC_NULL);CHKERRQ(ierr);
ierr = PCMGSetType(pc,PC_MG_MULTIPLICATIVE);CHKERRQ(ierr);
ierr = PCMGSetGalerkin(pc,PETSC_TRUE);CHKERRQ(ierr);
for (k=1; k<nlevels; k++){
  ierr = DMCreateInterpolation(da_list[k-1],da_list[k],&R,PETSC_NULL);CHKERRQ(ierr);
  ierr = PCMGSetInterpolation(pc,k,R);CHKERRQ(ierr);
  ierr = MatDestroy(&R);CHKERRQ(ierr);
}
Is there anything special about "R" in this case? As I mentioned before, I
know what the issue is: the coarse grid operators are not being set at all.
From what I can tell, they should be set in the routine PCSetUp_MG in mg.c,
but that code is never run unless pc->dm is set. I guess I'll just create the
operators myself. Even if I modify the code so that it enters the if
statement at line 577
if (mg->galerkin == 1) {
  Mat B;
  /* currently only handle case where mat and pmat are the same on coarser levels */
  ierr = KSPGetOperators(mglevels[n-1]->smoothd,&dA,&dB,&uflag);CHKERRQ(ierr);
  if (!pc->setupcalled) {
    for (i=n-2; i>-1; i--) {
      ierr = MatPtAP(dB,mglevels[i+1]->interpolate,MAT_INITIAL_MATRIX,1.0,&B);CHKERRQ(ierr);
      ierr = KSPSetOperators(mglevels[i]->smoothd,B,B,uflag);CHKERRQ(ierr);
      if (i != n-2) {ierr = PetscObjectDereference((PetscObject)dB);CHKERRQ(ierr);}
      dB = B;
    }
    if (n > 1) {ierr = PetscObjectDereference((PetscObject)dB);CHKERRQ(ierr);}
  } else {
    for (i=n-2; i>-1; i--) {
      ierr = KSPGetOperators(mglevels[i]->smoothd,PETSC_NULL,&B,PETSC_NULL);CHKERRQ(ierr);
      ierr = MatPtAP(dB,mglevels[i+1]->interpolate,MAT_REUSE_MATRIX,1.0,&B);CHKERRQ(ierr);
      ierr = KSPSetOperators(mglevels[i]->smoothd,B,B,uflag);CHKERRQ(ierr);
      dB = B;
    }
  }
which seems to be where this should happen, the MatPtAP will not run. I'm
using an MPIBAIJ matrix for 'dRdwPreT' and my restriction matrix 'RL1' is an
MPIAIJ matrix. When I try to run MatRARt myself with the code:
call MatRART(dRdwT, RL1, MAT_INITIAL_MATRIX, 1.0, newMat, PETScIERR)
call EChk(PETScIerr,__FILE__,__LINE__)
I receive error code 56 (/* no support for requested operation */); this is
a clean exit from PETSc.
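One workaround I may try, assuming the failure is just the BAIJ/AIJ pairing: convert the preconditioner matrix to MPIAIJ first and then form the Galerkin product with MatPtAP, which I believe does support a pair of MPIAIJ matrices. MatConvert, MatTranspose and MatPtAP are standard PETSc routines; dRdwPreTaij, PL1 and coarseMat are placeholder names for new Mat handles. A minimal sketch:

! Sketch only: placeholder names; assumes MatPtAP supports MPIAIJ with MPIAIJ
call MatConvert(dRdwPreT, MATMPIAIJ, MAT_INITIAL_MATRIX, dRdwPreTaij, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
! Use the transpose of the restriction as the prolongation P, then form P^T A P
call MatTranspose(RL1, MAT_INITIAL_MATRIX, PL1, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call MatPtAP(dRdwPreTaij, PL1, MAT_INITIAL_MATRIX, 1.d0, coarseMat, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)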
In conclusion, I have two issues: 1) the coarse grid operators are not
being set automatically in mg.c, and 2) using MatRARt() with an MPIBAIJ
matrix and an MPIAIJ matrix is unsupported.
Is there a PCMG example that doesn't use a DM to generate the
restriction/prolongation/matrix information that I could have a look at?
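In the meantime, here is roughly what I have in mind for setting the coarse operator by hand; this is a sketch only. PCMGGetSmoother and KSPSetOperators are standard PETSc routines, and kspCoarse/coarseMat are placeholder names (coarseMat being the Galerkin product formed as above):

! Sketch only: hand the manually formed Galerkin matrix to the coarse
! (level 0) smoother; kspCoarse is a placeholder KSP handle
call PCMGGetSmoother(pc, 0, kspCoarse, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)
call KSPSetOperators(kspCoarse, coarseMat, coarseMat, &
     DIFFERENT_NONZERO_PATTERN, PETScIerr)
call EChk(PETScIerr,__FILE__,__LINE__)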
Thanks,
Gaetan
On Sat, Sep 1, 2012 at 4:54 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:
> The full calling sequence is:
>
>
> call KSPCreate(SUMB_COMM_WORLD, ksp, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> useAD = .False.
> useTranspose = .True.
> usePC = .True.
> call setupStateResidualMatrix(drdwpret,useAD,usePC,useTranspose)
>
> call KSPSetOperators(ksp, dRdWT, dRdwpreT, &
> DIFFERENT_NONZERO_PATTERN, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call KSPSetFromOptions(ksp, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call KSPSetType(ksp, KSPFGMRES, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call KSPGMRESSetRestart(ksp, adjRestart, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call KSPGMRESSetCGSRefinementType(ksp, &
> KSP_GMRES_CGS_REFINE_IFNEEDED, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call kspgetpc(ksp, pc, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call pcsettype(pc, PCMG, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call PetscOptionsSetValue('-pc_mg_levels','2',PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> ! Set command line options
>
>
> call PCSetFromOptions(pc, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> ! Create the restriction operator between finest and one below:
>
>
> call createRestrictOperator(RL1, 1)
>
> call PCMGSetRestriction(pc, 1, RL1, PETScierr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
> call PCSetup(pc, PETScierr)
> call EChk(PETScIerr,__FILE__,__LINE__)
>
>
>
> On Sat, Sep 1, 2012 at 3:27 PM, Jed Brown <five9a2 at gmail.com> wrote:
>
>> Can you send the whole calling sequence and the full error message? It's
>> not clear from your email how to reproduce. You might also look at
>> src/ksp/ksp/examples/tutorials/ex42.c which uses PCMGSetGalerkin. Did you
>> set the fine grid operators before calling PCSetUp()? The coarse grid
>> operators should have been constructed earlier in PCSetUp_MG().
>> On Sep 1, 2012 10:05 AM, "Gaetan Kenway" <kenway at utias.utoronto.ca>
>> wrote:
>>
>>> I believe I have partially tracked down my problem. The issue arises at line
>>> 644 in mg.c with a call to KSPGetVecs(), which is used to get the coarse
>>> grid vectors for the RHS. The issue is that up until this point, the actual
>>> operators for mglevels[i]->smoothd have not been set, and therefore it
>>> cannot get the vectors.
>>>
>>> I am using
>>> call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
>>> call EChk(PETScIerr,__FILE__,__LINE__)
>>>
>>> with the intention of having the coarse matrices generated automatically.
>>> However, from looking over the code, it appears that when it is used in this
>>> manner, the coarse grid operators never get set. From what I can tell, it is
>>> only possible to create these matrices automatically if pc->dm is set, which
>>> I think means that the preconditioner is derived from a DM object.
>>>
>>> I've set my own restriction operator, but mg->galerkin has a value of -1
>>> in mg.c and none of the KSPSetOperators() calls are executed.
>>>
>>> I've also tried to generate the matrices myself using MatRARt() or
>>> MatPtAP(), but I receive error code 56 (/* no support for requested
>>> operation */). Is this operation only implemented for matrices derived
>>> from DM objects?
>>>
>>> Any other suggestions for creating the coarse grid operators manually?
>>>
>>> Thank you,
>>>
>>> Gaetan
>>>
>>>
>>> On Sat, Sep 1, 2012 at 12:29 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>>
>>>> On Fri, Aug 31, 2012 at 11:22 PM, Gaetan Kenway <
>>>> kenway at utias.utoronto.ca> wrote:
>>>>
>>>>> Hi Again
>>>>>
>>>>> I also tried the petsc-3.2 version and I still get the same backtrace.
>>>>>
>>>>> If it's not possible to figure out where the communicator segfault is
>>>>> coming from, it's not a huge deal... I've just set the option using
>>>>> PetscOptionsSetValue() and then used PCSetFromOptions() to pull it back out.
>>>>> That seems to work fine.
>>>>>
>>>>> Even avoiding the above problem with the PetscOptionsSetValue workaround, I'm
>>>>> still receiving error code 73 when I run the multigrid solver. I've
>>>>> included the backtrace output below, but it's not a lot of help since the
>>>>> code exited cleanly via my error checking procedure:
>>>>>
>>>>> =================================================================
>>>>> PETSc or MPI Error. Error Code 73. Detected on Proc 0
>>>>> Error at line: 122 in file: solveADjointTransposePETSc.F90
>>>>> =================================================================
>>>>>
>>>>> Program received signal SIGSEGV, Segmentation fault.
>>>>> 0xb68c98f0 in PMPI_Abort () from /usr/local/lib/libmpi.so.0
>>>>> (gdb) bt
>>>>> #0 0xb68c98f0 in PMPI_Abort () from /usr/local/lib/libmpi.so.0
>>>>> #1 0xb519f190 in pmpi_abort__ () from /usr/local/lib/libmpi_f77.so.0
>>>>> #2 0xb44c9e4c in echk (ierr=@0x49, file=..., line=@0x7a,
>>>>> .tmp.FILE.len_V$eb=30) at terminate.f90:154
>>>>> #3 0xb44bd68f in solveadjointtransposepetsc () at
>>>>> solveADjointTransposePETSc.F90:122
>>>>> #4 0xb44138a9 in f2py_rout_sumb_solveadjointtransposepetsc () from
>>>>> /tmp/tmpKYF_DT/sumb.so
>>>>> #5 0xb440fd35 in fortran_call () from /tmp/tmpKYF_DT/sumb.so
>>>>> #6 0x0805fd6a in PyObject_Call ()
>>>>> #7 0x080dd5b0 in PyEval_EvalFrameEx ()
>>>>> #8 0x080dfbb2 in PyEval_EvalCodeEx ()
>>>>> #9 0x08168f1f in ?? ()
>>>>> #10 0x0805fd6a in PyObject_Call ()
>>>>> #11 0x080dcbeb in PyEval_EvalFrameEx ()
>>>>> #12 0x080dfbb2 in PyEval_EvalCodeEx ()
>>>>> #13 0x080de145 in PyEval_EvalFrameEx ()
>>>>> #14 0x080dfbb2 in PyEval_EvalCodeEx ()
>>>>> #15 0x080dfca7 in PyEval_EvalCode ()
>>>>> #16 0x080fd956 in PyRun_FileExFlags ()
>>>>> #17 0x080fdbb2 in PyRun_SimpleFileExFlags ()
>>>>> #18 0x0805b6d3 in Py_Main ()
>>>>> #19 0x0805a8ab in main ()
>>>>>
>>>>
>>>> This stack doesn't involve PETSc at all.
>>>>
>>>>
>>>>>
>>>>> Valgrind was clean right up until the end, where I get the normal error
>>>>> message:
>>>>>
>>>>> [0]PETSC ERROR:
>>>>> ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>>>> probably memory access out of range
>>>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>>>> -on_error_attach_debugger
>>>>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>>>>> to find memory corruption errors
>>>>> [0]PETSC ERROR: likely location of problem given in stack below
>>>>> [0]PETSC ERROR: --------------------- Stack Frames
>>>>> ------------------------------------
>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>>>> available,
>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the
>>>>> function
>>>>> [0]PETSC ERROR: is given.
>>>>> [0]PETSC ERROR: [0] MatGetVecs line 8142
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
>>>>> [0]PETSC ERROR: [0] KSPGetVecs line 774
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
>>>>> [0]PETSC ERROR: [0] PCSetUp_MG line 508
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
>>>>> [0]PETSC ERROR: [0] PCSetUp line 810
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
>>>>> [0]PETSC ERROR: [0] KSPSetUp line 182
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>>>>> [0]PETSC ERROR: [0] KSPSolve line 351
>>>>> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/itfunc.c
>>>>>
>>>>> I will try it on a different system tomorrow to see if I have any more
>>>>> luck.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Gaetan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 31, 2012 at 11:08 PM, Jed Brown <jedbrown at mcs.anl.gov>wrote:
>>>>>
>>>>>> On Fri, Aug 31, 2012 at 10:06 PM, Matthew Knepley <knepley at gmail.com>wrote:
>>>>>>
>>>>>>> True, but the backtrace also shows that comm = 0x0 on the call to
>>>>>>> KSPCreate(), which
>>>>>>> leads me to believe that your petsc4py has not initialized PETSc,
>>>>>>> and therefore not
>>>>>>> initialized PETSC_COMM_WORLD.
>>>>>>>
>>>>>>
>>>>>> Good catch; maybe PetscFunctionBegin() should check that PETSc has
>>>>>> been initialized.
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>