[petsc-dev] asm / gasm
Barry Smith
bsmith at mcs.anl.gov
Sun Jul 3 12:17:37 CDT 2016
> On Jul 3, 2016, at 11:59 AM, Mark Adams <mfadams at lbl.gov> wrote:
>
> Thanks, I saw that. ASM converges slower than SOR :(
You can try -mg_levels_pc_asm_local_type multiplicative, which Matt added, but you probably need to switch to GMRES since I don't think Matt did a symmetric version of it.
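   Untested, but presumably something along these lines, on top of however ASM is currently being selected on the levels:

      -ksp_type gmres -mg_levels_pc_asm_local_type multiplicative

   the point being that multiplicative ASM makes the preconditioner nonsymmetric, so whatever solver is currently relying on symmetry (e.g. an outer CG) has to become GMRES.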
>
> On Sun, Jul 3, 2016 at 6:50 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Here is the problem:
>
> if (!sz) {
>   IS nis;
>   ierr = ISCreateGeneral(PETSC_COMM_SELF, 0, NULL, PETSC_COPY_VALUES, &nis);CHKERRQ(ierr);
>   ierr = PCASMSetLocalSubdomains(subpc, 1, NULL, &nis);CHKERRQ(ierr);
>   ierr = ISDestroy(&nis);CHKERRQ(ierr);
> }
>
> It was setting the first IS argument when it should have been setting the second. When I run after this change I get reasonable results.
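For reference, here is the same corrected block with the argument roles spelled out (a sketch; the is array, third argument of PCASMSetLocalSubdomains, holds the overlapped subdomains, while the is_local array, fourth argument, holds the non-overlapping local parts, which is what asm.c looks at):

    if (!sz) {    /* empty process: no local subdomain */
      IS nis;
      /* create an empty index set on this process */
      ierr = ISCreateGeneral(PETSC_COMM_SELF, 0, NULL, PETSC_COPY_VALUES, &nis);CHKERRQ(ierr);
      /* register it as is_local (fourth argument), not as is (third argument),
         so osm->is_local[i] is a valid, empty IS even on empty processes */
      ierr = PCASMSetLocalSubdomains(subpc, 1, NULL, &nis);CHKERRQ(ierr);
      ierr = ISDestroy(&nis);CHKERRQ(ierr);
    }

(The residual history quoted below is with this change in place.)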
>
> 0 KSP Residual norm 2.469493285658e+02
> 1 KSP Residual norm 1.335901394170e+01
> 2 KSP Residual norm 5.623002716761e-01
> 3 KSP Residual norm 5.201650164984e-02
> 4 KSP Residual norm 3.763050190737e-03
> 5 KSP Residual norm 3.024410926322e-04
> Linear solve converged due to CONVERGED_RTOL iterations 5
> 0 KSP Residual norm 2.469493285658e-03
> 1 KSP Residual norm 1.335901394170e-04
> 2 KSP Residual norm 5.623002716761e-06
> 3 KSP Residual norm 5.201650164983e-07
> 4 KSP Residual norm 3.763050190737e-08
> 5 KSP Residual norm 3.024410926322e-09
> Linear solve converged due to CONVERGED_RTOL iterations 5
> 0 KSP Residual norm 2.469493285658e-08
> 1 KSP Residual norm 1.335901394170e-09
> 2 KSP Residual norm 5.623002716761e-11
> 3 KSP Residual norm 5.201650164984e-12
> 4 KSP Residual norm 3.763050190737e-13
> 5 KSP Residual norm 3.024410926322e-14
>
>
>
> > On Jul 3, 2016, at 10:40 AM, Mark Adams <mfadams at lbl.gov> wrote:
> >
> > Barry,
> >
> > GAMG must be setting up ASM incorrectly on empty processors when using the aggregates. In asm.c, I see that osm->is_local is NULL for empty processors, and this causes a problem when the following branch is taken by fewer than 8 processes (of 8):
> >
> > if (osm->is_local) { // asm.c:282
> >   ....
> >   ierr = VecScatterCreate(vec,osm->is_local[i],osm->y_local[i],isl,&osm->prolongation[i]);CHKERRQ(ierr);
> >
> > I get the error "MPI_Allreduce() called in different locations (code lines) on different processors", because "vec" here is global: the processes with a NULL osm->is_local skip the branch and never reach the collective VecScatterCreate. This gamg.c code must not be correct:
> >
> > if (!sz) {
> >   IS is;
> >   ierr = ISCreateGeneral(PETSC_COMM_SELF, 0, NULL, PETSC_COPY_VALUES, &is);CHKERRQ(ierr);
> >   ierr = PCASMSetLocalSubdomains(subpc, 1, &is, NULL);CHKERRQ(ierr);
> >   ierr = ISDestroy(&is);CHKERRQ(ierr);
> > } else {
> >   ....
> > }
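To make the failure mode concrete, here is a toy MPI sketch (not PETSc code) of a collective call guarded by a rank-dependent condition, which is the situation the asm.c branch above ends up in when only some ranks have a non-NULL osm->is_local:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
      int rank, sum = 0, one = 1;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      if (rank < 7) {   /* analogous to "if (osm->is_local)" holding on only 7 of 8 ranks */
        /* collective on MPI_COMM_WORLD, but never reached by the last rank;
           a plain MPI program simply hangs here, while PETSc's debug-build
           checking of its collective calls reports it as the "called in
           different locations (code lines) on different processors" error
           quoted above */
        MPI_Allreduce(&one, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
      }
      MPI_Finalize();
      return 0;
    }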
> >
> > I will look into why osm->is_local seems to be NULL here when it should have an empty IS.
> >
> > Mark
>
>