On Thu, Apr 23, 2009 at 2:13 PM, Nguyen, Hung V ERDC-ITL-MS
<Hung.V.Nguyen@usace.army.mil> wrote:
> Hello Matt,
>
> > ILU is incredibly unpredictable.
>
> I got the same result when running without setting ILU as the
> sub_pc_type. It seems to me that a direct solver is set up as the default
> for the sub_ksp_type(?). Please let me know if that is not correct.

You can see what the default is using -ksp_view.
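For example, adding -ksp_view to one of the run lines used later in this
thread prints the full solver configuration, including the KSP and PC on
each ASM subdomain (the subdomain default is typically ILU with preonly,
which would explain the identical results):

  aprun -n 2 ./test_matrix_read -ksp_type cg -pc_type asm \
      -sub_pc_type ilu -sub_ksp_type preonly -ksp_view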
> > You have not provided the Jacobi numbers, but the particular nonzero
> > pattern of the non-overlapping matrix must be much more amenable. Also,
> > this is a really really bad preconditioner for your system.
>
> Actually, the asm preconditioner is not a really bad preconditioner for
> some of my ill-conditioned systems. For some other SPD linear systems, I
> have found that cg with the asm preconditioner converges better than the
> alternatives, and it scales well with the matrix size (see the attached
> file). However, it doesn't scale in this case. Here are the solver times
> with cg/jacobi included; cg/asm outperforms cg/jacobi only in the range
> from 1 to 4 processors.
>
>   Number PEs   cg/asm Time (secs)   cg/asm #it   cg/jacobi Time (secs)
>        1           31.317345             544         1999.276566
>        2          263.172225            6959         1188.067975
>        4          734.828840           23233          984.062940
>        8          805.217591           41250          538.102407
>       16          611.813716           49262          308.547316
>       32          345.331928           49792          170.074248
>       64          212.084555           53771           92.398144
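>
> (The cg/jacobi timings were presumably produced by the same run line as
> the cg/asm runs below, with the preconditioner switched and the subdomain
> options dropped, e.g.
>
>   aprun -n 2 ./test_matrix_read -ksp_type cg -pc_type jacobi \
>       -ksp_rtol 1.0e-12 -ksp_max_it 100000
>
> but the exact cg/jacobi invocation is not shown in this thread.)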
>
> > I would put my time into figuring out why my system is so
> > ill-conditioned and try to formulate a good preconditioner, like an
> > approximate system, etc.
>
> The linear system comes from groundwater flow in a water-repellent soil,
> which can produce a very ill-conditioned linear system.

This is why people develop special purpose discretizations for these
problems.

   Matt

> -Hung
>
> -----Original Message-----
> From: petsc-users-bounces@mcs.anl.gov
> [mailto:petsc-users-bounces@mcs.anl.gov] On Behalf Of Matthew Knepley
> Sent: Thursday, April 23, 2009 11:12 AM
> To: PETSc users list
> Subject: Re: Cg/asm doesn't scale
>
> On Thu, Apr 23, 2009 at 11:07 AM, Nguyen, Hung V ERDC-ITL-MS
> <Hung.V.Nguyen@usace.army.mil> wrote:
>
> > Hello,
> >
> > I tried to solve an SPD linear system using the cg/asm preconditioner
> > and found that it doesn't scale well; see the table below. Note: it
> > does scale well with the cg/jacobi preconditioner.
> >
> > Do you know why it doesn't scale?
>
> ILU is incredibly unpredictable. You have not provided the Jacobi
> numbers, but the particular nonzero pattern of the non-overlapping matrix
> must be much more amenable. Also, this is a really really bad
> preconditioner for your system. I would put my time into figuring out why
> my system is so ill-conditioned and try to formulate a good
> preconditioner, like an approximate system, etc.
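>
> (A minimal sketch of that last idea, with illustrative names; it uses the
> present-day KSPSetOperators signature, while the 2009-era call also took
> a MatStructure argument:
>
>   #include <petscksp.h>
>
>   int main(int argc, char **argv)
>   {
>     Mat A, P;   /* A: true operator; P: simpler approximation of A  */
>     Vec x, b;
>     KSP ksp;
>
>     PetscInitialize(&argc, &argv, NULL, NULL);
>     /* ... assemble A, the approximate matrix P, and the vectors ... */
>     KSPCreate(PETSC_COMM_WORLD, &ksp);
>     KSPSetOperators(ksp, A, P); /* Krylov applies A; PC is built from P */
>     KSPSetFromOptions(ksp);     /* -ksp_type cg -pc_type asm still work */
>     KSPSolve(ksp, b, x);
>     KSPDestroy(&ksp);
>     PetscFinalize();
>     return 0;
>   }
>
> The point is that the preconditioner, e.g. ASM/ILU, factors the
> better-conditioned P while CG still converges on the true system.)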
>
>    Matt
>
> > Thanks,
> >
> > -hung
> >
> >   Number PEs   Solver Time (secs)    #it
> >        1           31.317345          544
> >        2          263.172225         6959
> >        4          734.828840        23233
> >        8          805.217591        41250
> >       16          611.813716        49262
> >       32          345.331928        49792
> >       64          212.084555        53771
> >
> > ---
> > 1 : aprun -n 1 ./test_matrix_read -ksp_type cg -pc_type asm
> >       -pc_asm_type basic -sub_pc_type ilu -sub_ksp_type preonly
> >       -ksp_rtol 1.0e-12 -ksp_max_it 100000
> > Time in PETSc solver: 31.317345 seconds
> > The number of iteration = 544
> > The solution residual error = 1.658653e-08
> > 2 norm          7.885361e-07
> > infinity norm   6.738382e-09
> > 1 norm          2.124207e-04
> >
> > Application 679466 resources: utime 0, stime 0
> > ************************ Beginning new run ************************
> >
> > 2 : aprun -n 2 ./test_matrix_read -ksp_type cg -pc_type asm
> >       -pc_asm_type basic -sub_pc_type ilu -sub_ksp_type preonly
> >       -ksp_rtol 1.0e-12 -ksp_max_it 100000
> > Time in PETSc solver: 263.172225 seconds
> > The number of iteration = 6959
> > The solution residual error = 1.794494e-08
> > 2 norm          6.579571e-07
> > infinity norm   8.745052e-09
> > 1 norm          1.907733e-04
> >
> > -- Here is info about matrix A:
> >
> > Computed <structure:nrows>                as <178353>
> > Computed <structure:symmetry>             as <0>
> > Computed <structure:nnzeros>              as <3578321>
> > Computed <structure:max-nnzeros-per-row>  as <27>
> > Computed <structure:min-nnzeros-per-row>  as <6>
> > Computed <structure:left-bandwidth>       as <76553>
> > Computed <structure:right-bandwidth>      as <76553>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
>     -- Norbert Wiener

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
    -- Norbert Wiener