Cg/asm doesn't scale

Matthew Knepley knepley at gmail.com
Thu Apr 23 14:50:16 CDT 2009


On Thu, Apr 23, 2009 at 2:13 PM, Nguyen, Hung V ERDC-ITL-MS <
Hung.V.Nguyen at usace.army.mil> wrote:

> Hello Matt,
>
> >ILU is incredibly unpredictable.
>
> I got the same result when running without setting ILU as the
> sub_pc_type. It seems to me that a direct solver is used by default for
> the subdomain solves(?). Please let me know if that is not correct.
>

You can see what the default is using -ksp_view.
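For example, appending it to the command line you are already using:

  aprun -n 2 ./test_matrix_read -ksp_type cg -pc_type asm -ksp_view

The output includes a section for each subdomain block showing the KSP and
PC types that were actually used, so you can confirm what the
sub_ksp_type/sub_pc_type defaults are.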


> >You have not provided the Jacobi numbers, but the particular nonzero
> >pattern of the non-overlapping matrix must be much more amenable. Also,
> >this is a really really bad preconditioner for your system.
>
> Indeed, the asm preconditioner is not a really bad preconditioner for
> some of my ill-conditioned systems. For some other SPD linear systems, I
> have found that cg with the asm preconditioner converges better than the
> alternatives, and it scales well for that matrix size (see the attached
> file). However, it doesn't scale in this case. Here are the solver times
> for cg/jacobi alongside cg/asm; cg/asm only outperforms cg/jacobi in the
> range from 1 to 4 processors.
>
>        Number PEs   cg/asm time (secs)   cg/asm #it   cg/jacobi time (secs)
>         1               31.317345            544          1999.276566
>         2              263.172225           6959          1188.067975
>         4              734.828840          23233           984.062940
>         8              805.217591          41250           538.102407
>        16              611.813716          49262           308.547316
>        32              345.331928          49792           170.074248
>        64              212.084555          53771            92.398144
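>
>        Working out the speedups from this table: from 1 to 64 PEs,
>        cg/jacobi improves by 1999.28/92.40 = ~21.6x, while cg/asm gets
>        slower (31.32 s to 212.08 s) because its iteration count grows
>        from 544 to 53771 as the number of subdomains increases.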
>
> >I would put my time into figuring out why my system is so
> >ill-conditioned and try to formulate a good preconditioner, like an
> >approximate system, etc.
>
> The linear system comes from groundwater flow in a water-repellent soil,
> which can produce a very ill-conditioned system.


This is why people develop special purpose discretizations for these
problems.
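
If you can assemble an approximate operator, you can hand it to KSP as the
preconditioning matrix while CG still applies the true operator. A minimal
sketch, assuming A and Papprox are already-assembled Mats and b, x are
matching Vecs (newer PETSc versions drop the final MatStructure argument
from KSPSetOperators):

  #include <petscksp.h>

  /* A defines the system; Papprox is used only to build the preconditioner */
  KSP ksp;
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, Papprox, SAME_NONZERO_PATTERN);
  KSPSetType(ksp, KSPCG);
  /* same tolerances as your runs: -ksp_rtol 1.0e-12 -ksp_max_it 100000 */
  KSPSetTolerances(ksp, 1.0e-12, PETSC_DEFAULT, PETSC_DEFAULT, 100000);
  KSPSetFromOptions(ksp);   /* still honors -pc_type asm, -sub_pc_type, ... */
  KSPSolve(ksp, b, x);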

  Matt


>
> -Hung
>
> -----Original Message-----
> From: petsc-users-bounces at mcs.anl.gov
> [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Matthew Knepley
> Sent: Thursday, April 23, 2009 11:12 AM
> To: PETSc users list
> Subject: Re: Cg/asm doesn't scale
>
> On Thu, Apr 23, 2009 at 11:07 AM, Nguyen, Hung V ERDC-ITL-MS
> <Hung.V.Nguyen at usace.army.mil> wrote:
>
>
>        Hello,
>
>        I tried to solve an SPD linear system using the cg/asm
>        preconditioner and found that it doesn't scale well; see the
>        table below. Note: it does scale well with the cg/jacobi
>        preconditioner.
>
>        Do you know why it doesn't scale?
>
>
> ILU is incredibly unpredictable. You have not provided the Jacobi numbers,
> but the particular nonzero pattern of the non-overlapping matrix must be
> much
> more amenable. Also, this is a really really bad preconditioner for your
> system. I would put my time into figuring out why my system is so
> ill-conditioned and try to formulate a good preconditioner, like an
> approximate system, etc.
>
>  Matt
>
>
>
>        Thanks,
>
>        -hung
>
>        Number PEs   Solver Time (secs)     #it
>         1               31.317345          544
>         2              263.172225         6959
>         4              734.828840        23233
>         8              805.217591        41250
>        16              611.813716        49262
>        32              345.331928        49792
>        64              212.084555        53771
>
>
>        ---
>        1 :  aprun -n 1 ./test_matrix_read -ksp_type cg -pc_type asm
>             -pc_asm_type basic -sub_pc_type ilu -sub_ksp_type preonly
>             -ksp_rtol 1.0e-12 -ksp_max_it 100000
>        Time in PETSc solver: 31.317345 seconds
>        The number of iteration       = 544
>        The solution residual error = 1.658653e-08
>        2 norm 7.885361e-07
>         infinity norm 6.738382e-09
>         1 norm 2.124207e-04
>
>        Application 679466 resources: utime 0, stime 0
>        ************************ Beginning new run ************************
>
>        2 :  aprun -n 2 ./test_matrix_read -ksp_type cg -pc_type asm
>             -pc_asm_type basic -sub_pc_type ilu -sub_ksp_type preonly
>             -ksp_rtol 1.0e-12 -ksp_max_it 100000
>        Time in PETSc solver: 263.172225 seconds
>        The number of iteration       = 6959
>        The solution residual error = 1.794494e-08
>        2 norm 6.579571e-07
>         infinity norm 8.745052e-09
>         1 norm 1.907733e-04
>
>        -- Here is info about matrix A:
>
>        Computed <structure:nrows> as <178353>
>        Computed <structure:symmetry> as <0>
>        Computed <structure:nnzeros> as <3578321>
>        Computed <structure:max-nnzeros-per-row> as <27>
>        Computed <structure:min-nnzeros-per-row> as <6>
>        Computed <structure:left-bandwidth> as <76553>
>        Computed <structure:right-bandwidth> as <76553>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener