[petsc-dev] asm / gasm
Barry Smith
bsmith at mcs.anl.gov
Sun Jun 26 16:56:44 CDT 2016
I have changed the GAMG aggs support to use PCASM and not PCGASM. I don't see how it ever worked with PCGASM; very strange. Maybe it was written and tested with PCASM and then later someone changed it to PCGASM and did not test it again.
I was wrong when I said it would be difficult to change to PCASM; it was much simpler than I thought. This is because I assumed the code had been done correctly for PCGASM, while in fact, though it called PCGASM, it followed the model for what PCASM needed :-(.
Anyway, I have a branch barry/fix-gamg-asm-aggs that now works for my ex10 tests. I will continue to clean up the branch, fix naming styles, test with more difficult GAMG examples, and add proper nightly tests for this functionality, which it never had before.
In addition, Fande is adding error checking to PCGASM so that if you pass it badly formatted subdomain information (like what GAMG was passing), it will generate a very useful error message instead of just chugging along with gibberish.
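One plausible shape for such a check, just as a sketch (not Fande's actual code; the function and argument names here are made up): the inner subdomain index sets handed to PCGASM should together account for every local row of pmat.

  static PetscErrorCode CheckSubdomainCover(PC pc,Mat pmat,PetscInt n,IS iis[])
  {
    PetscErrorCode ierr;
    PetscInt       i,m,len,sum = 0;

    PetscFunctionBegin;
    ierr = MatGetLocalSize(pmat,&m,NULL);CHKERRQ(ierr);
    for (i=0; i<n; i++) {
      /* count the dofs this rank contributes to each inner subdomain */
      ierr = ISGetLocalSize(iis[i],&len);CHKERRQ(ierr);
      sum += len;
    }
    if (sum != m) SETERRQ2(PetscObjectComm((PetscObject)pc),PETSC_ERR_ARG_SIZ,"Subdomains cover %D dofs but pmat has %D local rows",sum,m);
    PetscFunctionReturn(0);
  }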
Barry
Mark, my confusion came from the fact that a single MPI process owns each of the aggs; that is, the list of degrees of freedom for each agg is all on one process. This is exactly what PCASM needs but NOT what PCGASM needs.
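For concreteness, here is roughly what that looks like on the PCASM side. This is a sketch, not the branch code; nagg/aggsize/aggdofs stand in for however the aggregate lists are actually stored:

  #include <petscksp.h>

  PetscErrorCode SetAggSubdomains(PC pc,PetscInt nagg,const PetscInt *aggsize,const PetscInt **aggdofs)
  {
    PetscErrorCode ierr;
    IS             *is;
    PetscInt       i;

    PetscFunctionBegin;
    ierr = PetscMalloc1(nagg,&is);CHKERRQ(ierr);
    for (i=0; i<nagg; i++) {
      /* every index in aggdofs[i] is owned by this rank, so a
         PETSC_COMM_SELF IS per agg is exactly what PCASM expects */
      ierr = ISCreateGeneral(PETSC_COMM_SELF,aggsize[i],aggdofs[i],PETSC_COPY_VALUES,&is[i]);CHKERRQ(ierr);
    }
    ierr = PCASMSetLocalSubdomains(pc,nagg,is,NULL);CHKERRQ(ierr);
    for (i=0; i<nagg; i++) {ierr = ISDestroy(&is[i]);CHKERRQ(ierr);} /* PCASM keeps its own references */
    ierr = PetscFree(is);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }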
> On Jun 26, 2016, at 2:12 PM, Mark Adams <mfadams at lbl.gov> wrote:
>
>
>
> On Sun, Jun 26, 2016 at 5:43 PM, Fande Kong <fdkong.jd at gmail.com> wrote:
> Mark,
>
> Is the size of the index set smaller than the size of pmat in your case?
>
> I've added the BC vertices into a dummy domain, which will just be a diagonal matrix, so there is complete "cover" now. It works in serial.
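> Roughly like this (a sketch only; nbc/bcdofs are placeholders for however I hold the BC list, and nagg is the number of real aggregates):
>
>   IS bcis;
>   /* the stripped-out BC dofs become one extra "dummy" subdomain; its
>      submatrix is just the diagonal BC block, so its solve is trivial */
>   ierr = ISCreateGeneral(PETSC_COMM_SELF,nbc,bcdofs,PETSC_COPY_VALUES,&bcis);CHKERRQ(ierr);
>   is[nagg] = bcis; /* now pass nagg+1 domains so the cover is complete */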
>
> It does not work in parallel. If I change the number of processors for ex56 in ksp/ksp/examples/tutorials from 1 to 8, I get this error:
>
> mark/gamg-agg-gasm *= ~/Codes/petsc/src/ksp/ksp/examples/tutorials$ make runex56
> 1,4c1,198
> < Linear solve converged due to CONVERGED_RTOL iterations 8
> < Linear solve converged due to CONVERGED_RTOL iterations 8
> < Linear solve converged due to CONVERGED_RTOL iterations 8
> < [0]main |b-Ax|/|b|=1.940043e-04, |b|=4.969822e+00, emax=9.926090e-01
> ---
> > [4]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [4]PETSC ERROR: Nonconforming object sizes
> > [4]PETSC ERROR: Vector wrong size 192 for scatter 201 (scatter forward and vector from != ctx from size)
>
>
> They should be the same. I do not think we ever considered the case where the index set has a different size from that of pmat.
>
> Fande,
>
> On Sun, Jun 26, 2016 at 7:09 AM, Mark Adams <mfadams at lbl.gov> wrote:
> GASM does assume the index sets include every equation in the matrix. It should probably check this, since it has pmat.
>
> I guess I can add these BC vertices in.
>
> On Sun, Jun 26, 2016 at 11:20 AM, Mark Adams <mfadams at lbl.gov> wrote:
> Fande,
>
> An alternative debug path that may be simpler and more direct, as it works with master: 'make runex56' in ksp/ksp/examples/tutorials in branch mark/gamg-agg-asm.
>
> This runs clean in valgrind (for me). I've added an ISView call to see the data that causes the error. It runs on one processor, uses a 3^3 cell grid (tiny), and exits cleanly with the error:
>
> ....
> 87 189
> 88 190
> 89 191
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Index 144 at 42 location greater than max 144
> [0]PETSC ERROR: #1 VecScatterCheckIndices_Private() line 39 in /Users/markadams/Codes/petsc/src/vec/vec/utils/vscat.c
> [0]PETSC ERROR: #2 VecScatterCreate() line 1227 in /Users/markadams/Codes/petsc/src/vec/vec/utils/vscat.c
> [0]PETSC ERROR: #3 PCSetUp_GASM() line 481 in /Users/markadams/Codes/petsc/src/ksp/pc/impls/gasm/gasm.c
> ....
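> (The index dump above is just a plain ISView on each agg IS, something like
>
>   ierr = ISView(is[i],PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);
>
> right before the domains are handed to GASM.)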
>
> This index 144 is very suspicious to me because there are 144 REAL dofs in this test: 192 "gross" dofs minus 48 boundary conditions. This problem has just two aggregates, with sizes 54 & 90 (54 + 90 = 144). (I strip out the BC vertices.)
>
> Maybe GASM is getting confused because I do not give it domains that cover the entire mesh (I strip out BC vertices). GASM thinks there are 144 equations in this system when in fact there are 192.
>
> It looks to me like GASM is working in a space stripped of BCs, but it is using my indices, which are in the full space.
>
> Mark
>
>
>
>
>