[petsc-users] Using superlu_dist in a direct solve

Jed Brown jedbrown at mcs.anl.gov
Sun Dec 23 20:26:17 CST 2012


On Sun, Dec 23, 2012 at 8:15 PM, Sanjay Govindjee <s_g at berkeley.edu> wrote:

>  Sorry for the confusion.  I thought I was clear.  Here is the make line I
> was running.
>
>
>         -@${MPIEXEC} -n 2 ./ex6 -ksp_type preonly  -pc_type lu
> -pc_factor_mat_solver_package superlu_dist -options_left no \
>            -f arco1 > ex6_1.tmp 2>&1; \
>            if (${DIFF} output/ex6_1.out ex6_1.tmp) then true; \
>            else echo ${PWD} ; echo "Possible problem with with ex6_1,
> diffs above \n========================================="; fi; \
>            ${RM} -f ex6_1.tmp
>
> If you change superlu_dist to spooles it works just fine, as does any
> other iterative method you care to try.  The matrix arco1 was downloaded
> as per the instructions in the makefile.
>

I cannot reproduce your problem. Do you have a build with a different
compiler (like GCC)? Also, what BLAS/LAPACK is being used? (You can send
configure.log to petsc-maint at mcs.anl.gov.)
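
For reference, a quick way to pull that information out of an existing build
(a rough sketch; the grep patterns are only a guess at what to look for in
configure.log, which configure writes at the top of PETSC_DIR):

    cd $PETSC_DIR
    grep -i 'BLAS/LAPACK' configure.log   # library configure settled on
    grep -i 'C compiler'  configure.log   # compiler actually used

If that shows a vendor BLAS, one quick experiment is to rebuild once with the
reference implementation (--download-f-blas-lapack at configure time) to rule
the BLAS in or out.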


> I will try reproducing the superlu_dist error with
> snes/examples/tutorials/ex5 now.
>

This is the file Matt suggested.


> (fyi, under snes/examples/tests/output the files ex5_1.out and ex5_2.out
> are missing, so one cannot run the tests out of the box).
>

Heh, these have been missing since the beginning of time (revision 0). I'll
add them.


>
> -sanjay
>
>
>
> On 12/23/12 6:07 PM, Jed Brown wrote:
>
> You didn't say what options you were running ex6 with, but with the
> options used for the tests, I see
>
> ~/petsc/src/ksp/ksp/examples/tests$ mpirun.hydra -n 2 ./ex6 -f
> ~/petsc/datafiles/matrices/arco1 -pc_type lu -pc_factor_mat_solver_package
> superlu_dist
> Number of iterations =   1
> Residual norm = 2.23439e-11
>
>
> You need to give precise instructions for how to reproduce the behavior
> you are seeing.
>
> Also, for experimenting with matrices read from files, we prefer
> src/ksp/ksp/examples/tutorials/ex10.c because it is better commented and
> has more features.
>
>
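
For what it's worth, a typical ex10 run on this matrix would look something
like the following (a sketch; ex10 reads its matrix with -f0 rather than -f,
if I remember its options correctly, and the solver options mirror the ex6
test above):

    cd src/ksp/ksp/examples/tutorials
    make ex10
    mpiexec -n 2 ./ex10 -f0 ~/petsc/datafiles/matrices/arco1 \
        -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist -ksp_view

Adding -ksp_view confirms which factorization package was actually used for
the solve.
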
> On Sun, Dec 23, 2012 at 7:08 PM, Sanjay Govindjee <s_g at berkeley.edu>wrote:
>
>>  Not sure what you mean by "where is your matrix?"  I am simply running
>> ex6 in the ksp/examples/tests directory.
>>
>> The reason I ran this test is that I was seeing the same behavior with
>> my finite element code (on perfectly benign problems).
>>
>> Is there a built-in test that you use to check that superlu_dist is
>> working properly with petsc, i.e., something you know works with
>> petsc 3.3-p5?
>>
>> -sanjay
>>
>>
>>
>> On 12/23/12 4:56 PM, Jed Brown wrote:
>>
>> Where is your matrix? It might be ending up with a very bad pivot. If the
>> problem can be reproduced, it should be reported to the SuperLU_DIST
>> developers to fix. (Note that we do not see this with other matrices.) You
>> can also try MUMPS.
>>
>>
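
For reference, trying MUMPS on the same problem is only a change of the
solver-package option, but MUMPS and its dependencies have to be built in
first. Roughly (assuming the usual download options at configure time; they
are not in the configure line quoted further down):

    ./configure ... --download-mumps --download-scalapack --download-blacs
    mpiexec -n 2 ./ex6 -f arco1 -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps
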
>> On Sun, Dec 23, 2012 at 6:48 PM, Sanjay Govindjee <s_g at berkeley.edu>wrote:
>>
>>>  I wanted to use SuperLU_DIST to perform a direct solve but seem to be
>>> encountering a problem.  I was wondering if this is a known issue and if
>>> there is a solution for it.
>>>
>>> The problem is easily observed using ex6.c in src/ksp/ksp/examples/tests.
>>>
>>> Out of the box, make runex6 produces a residual error of O(1e-11); all
>>> is well.
>>>
>>> I then changed the run to use two processors and added the flag
>>> -pc_factor_mat_solver_package spooles; this produces a residual error of
>>> O(1e-11), and all is still well.
>>>
>>> I then switched over to -pc_factor_mat_solver_package superlu_dist and the
>>> residual error came back as 22.6637!  Something seems very wrong.
>>>
>>> My build is perfectly vanilla:
>>>
>>> export PETSC_DIR=/Users/sg/petsc-3.3-p5/
>>> export PETSC_ARCH=intel
>>>
>>> ./configure --with-cc=icc --with-fc=ifort  \
>>> -download-{spooles,parmetis,superlu_dist,prometheus,mpich,ml,hypre,metis}
>>>
>>> make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel all
>>> make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel test
>>>
>>> -sanjay
>>>
>>
>>
>>
>