[petsc-dev] [petsc-maint #40882] Re: ML changing the size of the coarse-level problem
Barry Smith
bsmith at mcs.anl.gov
Tue Jan 26 14:52:50 CST 2010
Jed,
Thanks for the report. It is my intention to fix this. But when I
looked at ml.c I nearly barfed: it is excessively complicated because
of a mistake I made in the design of PC_MG. I made it an array of
levels, so it cannot exist before the number of levels is set, and
with ML the number of levels is not known up front.
I will change the organization of PCMG slightly by making PC_MG a
single object, with the per-level information stored in an array
attached to it. This is a better design anyway. The reorganization may
take a few days; then I will fix the PETSc bug in using ML.
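A minimal sketch of what I have in mind; the type and field names
below are illustrative, not the actual mg.c code:

/* Per-level data (illustrative fields) */
typedef struct {
  KSP smoothd, smoothu;       /* pre- and post-smoothers */
  Mat interpolate, restrct;   /* grid-transfer operators */
} PC_MG_Level;

/* Current: pc->data is itself an array of PC_MG_Level, so it cannot
   be allocated before the number of levels is set, which ML does not
   know up front. */

/* Proposed: PC_MG is a single struct created with the PC; the
   per-level array hangs off it and is allocated later. */
typedef struct {
  PetscInt     nlevels;       /* set once ML reports its hierarchy */
  PC_MG_Level *levels;        /* attached when nlevels is known */
} PC_MG;

With this, PCSetUp_ML can rely on the PC_MG existing from the start
and only attach the levels array after ML has built the hierarchy.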
Barry
On Jan 26, 2010, at 6:24 AM, Jed Brown wrote:
>> but I think the problem is deeper
>
> Indeed, the size of the arrays in the interpolation operator seems to
> change here. So although the coarse-level problem did not change size,
> we cannot reuse the restriction/interpolation operators.
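>
> A minimal sketch of the kind of dimension check that would catch this
> before the segfault; the names P, Afine, and Acoarse are hypothetical
> stand-ins for the stored interpolation operator and the fine- and
> coarse-level matrices:
>
>   PetscInt pm, pn, mf, nc;
>   MatGetSize(P, &pm, &pn);              /* P maps coarse to fine: pm x pn */
>   MatGetSize(Afine, &mf, PETSC_NULL);
>   MatGetSize(Acoarse, &nc, PETSC_NULL);
>   if (pm != mf || pn != nc) {
>     /* hierarchy was rebuilt with different sizes: P is stale and
>        must be regenerated, not reused */
>   }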
>
> ./ex48 -M 6 -P 4 -thi_mat_type aij -pc_type ml -pc_ml_maxnlevels 3
>
> Program received signal SIGSEGV, Segmentation fault.
> 0x00007ffff66a5048 in MatMultAdd_SeqAIJ_Inode (A=0x8ea870, xx=0x938c80, zz=0x8c1fb0, yy=0x8c1fb0) at inode.c:627
> 627         tmp1 = x[i2];
> (gdb) bt
> #0  0x00007ffff66a5048 in MatMultAdd_SeqAIJ_Inode (A=0x8ea870, xx=0x938c80, zz=0x8c1fb0, yy=0x8c1fb0) at inode.c:627
> #1  0x00007ffff670e2e5 in MatMultAdd (mat=0x8ea870, v1=0x938c80, v2=0x8c1fb0, v3=0x8c1fb0) at matrix.c:2032
> #2  0x00007ffff6740b45 in MatInterpolateAdd (A=0x8ea870, x=0x938c80, y=0x8c1fb0, w=0x8c1fb0) at matrix.c:6654
> #3  0x00007ffff7525e32 in PCMGMCycle_Private (pc=0x844600, mglevels=0x8d6630, reason=0x0) at mg.c:52
> #4  0x00007ffff7527b4b in PCApply_MG (pc=0x844600, b=0x8bbf10, x=0x8c1fb0) at mg.c:193
> #5  0x00007ffff755b402 in PCApply (pc=0x844600, x=0x8bbf10, y=0x8c1fb0) at precon.c:357
> #6  0x00007ffff748a4ed in FGMREScycle (itcount=0x7fffffff9e60, ksp=0x835b50) at fgmres.c:177
> #7  0x00007ffff748b337 in KSPSolve_FGMRES (ksp=0x835b50) at fgmres.c:303
> #8  0x00007ffff74f8bdf in KSPSolve (ksp=0x835b50, b=0x820f00, x=0x89e110) at itfunc.c:396
> #9  0x00007ffff7bc6cac in SNES_KSPSolve (snes=0x818130, ksp=0x835b50, b=0x820f00, x=0x89e110) at snes.c:2931
> #10 0x00007ffff7b9611f in SNESSolve_LS (snes=0x818130) at ls.c:191
> #11 0x00007ffff7bc136a in SNESSolve (snes=0x818130, b=0x0, x=0x824050) at snes.c:2242
> #12 0x00007ffff7b8e33f in DMMGSolveSNES (dmmg=0x80ed30, level=0) at damgsnes.c:510
> #13 0x00007ffff7b86aa6 in DMMGSolve (dmmg=0x80ed30) at damg.c:313
> #14 0x0000000000412f9a in main (argc=11, argv=0x7fffffffd4a8) at ex48.c:1284
> (gdb) p i2
> $1 = -1077609817
>
>
> ML is terrible compared to geometric multigrid for this problem, so
> this being broken is not currently inhibiting work, but it would be
> nice to include it in a benchmark.
>
> Jed
>