[petsc-users] Using superlu_dist in a direct solve

Sanjay Govindjee s_g at berkeley.edu
Wed Dec 26 15:13:54 CST 2012


I have done some more testing of the problem, continuing with 
src/ksp/ksp/examples/tutorials/ex2.c.

The behavior I am seeing is that with smaller problem sizes superlu_dist 
behaves properly, but with larger problem sizes things go wrong, and what 
goes wrong is apparently consistent: the error appears with both my intel 
build and my gcc build.

I have two run lines:

runex2superlu:
         -@${MPIEXEC} -n 2 ./ex2 -ksp_monitor_short -m 100 -n 100 
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist

runex2spooles:
         -@${MPIEXEC} -n 2 ./ex2 -ksp_monitor_short -m 100 -n 100 
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package spooles

From my intel build, I get

sg-macbook-prolocal:tutorials sg$ make runex2superlu
Norm of error 7.66145e-13 iterations 1
sg-macbook-prolocal:tutorials sg$ make runex2spooles
Norm of error 2.21422e-12 iterations 1

From my GCC build, I get
sg-macbook-prolocal:tutorials sg$ make runex2superlu
Norm of error 7.66145e-13 iterations 1
sg-macbook-prolocal:tutorials sg$ make runex2spooles
Norm of error 2.21422e-12 iterations 1

If I change -m 100 -n 100 to -m 500 -n 500, I get for my intel build:

sg-macbook-prolocal:tutorials sg$ make runex2superlu
Norm of error 419.953 iterations 1
sg-macbook-prolocal:tutorials sg$ make runex2spooles
Norm of error 2.69468e-10 iterations 1

From my GCC build with -m 500 -n 500, I get

sg-macbook-prolocal:tutorials sg$ make runex2superlu
Norm of error 419.953 iterations 1
sg-macbook-prolocal:tutorials sg$ make runex2spooles
Norm of error 2.69468e-10 iterations 1
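
I have not yet tried it, but I suppose adding -ksp_view to the superlu_dist 
run line would at least confirm which factorization package is actually being 
used in the 500x500 case, i.e. the superlu_dist run line with -ksp_view 
appended:

         -@${MPIEXEC} -n 2 ./ex2 -ksp_monitor_short -m 500 -n 500 
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_view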


Any suggestions will be greatly appreciated.

-sanjay






On 12/23/12 6:42 PM, Matthew Knepley wrote:
>
> On Sun, Dec 23, 2012 at 9:37 PM, Sanjay Govindjee <s_g at berkeley.edu> wrote:
>
>     I decided to go with ksp/ksp/examples/tutorials/ex2.c; I was
>     unsure how to convert the run lines for snes/examples/ex5.c to
>     work with a direct solver, as I am not versed in SNES options.
>
>     Notwithstanding, something strange is happening only on select
>     examples.  With ksp/ksp/examples/tutorials/ex2.c and the run line:
>
>     -@${MPIEXEC} -n 2 ./ex2 -ksp_monitor_short -m 20 -n 20 -ksp_type
>     preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
>
>     I get good results (of the order):
>
>     Norm of error 1.85464e-14 iterations 1
>
>     using both superlu_dist and spooles.
>
>     My BLAS/LAPACK: -llapack -lblas (so native to my machine).
>
>     If you can guide me on a run line for snes ex5.c, I can try
>     that too.  I'll also try to construct a GCC build later to see if
>     that is an issue.
>
>
> Same line on ex5, but ex2 is good enough. However, it will not tell us 
> anything new. Try another build.
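>
> (For ex5 the analogous run would presumably be something like
>
>     ${MPIEXEC} -n 2 ./ex5 -da_grid_x 100 -da_grid_y 100 -snes_monitor
>     -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
>
> i.e. the same solver options, with the grid size set through
> -da_grid_x/-da_grid_y instead of -m/-n for the DMDA-based Bratu example
> in snes/examples/tutorials.)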
>
>    Matt
>
>     -sanjay
>
>
>     On 12/23/12 5:58 PM, Matthew Knepley wrote:
>>     On Sun, Dec 23, 2012 at 8:08 PM, Sanjay Govindjee
>>     <s_g at berkeley.edu> wrote:
>>
>>         Not sure what you mean by "where is your matrix?"  I am simply
>>         running ex6 in the ksp/examples/tests directory.
>>
>>         The reason I ran this test is because I was seeing the same
>>         behavior with my finite element code (on perfectly benign
>>         problems).
>>
>>         Is there a built-in test that you use to check that
>>         superlu_dist is working properly with PETSc,
>>         i.e. something you know works with PETSc 3.3-p5?
>>
>>
>>     1) Run it on SNES ex5 (or KSP ex2), which is a nice Laplacian
>>
>>     2) Compare with MUMPS
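>>
>>     (Assuming MUMPS was pulled in at configure time, e.g. with
>>     --download-mumps --download-scalapack --download-blacs, the comparison
>>     would presumably be the same run line with the package swapped, e.g.:
>>
>>         ${MPIEXEC} -n 2 ./ex2 -ksp_monitor_short -m 20 -n 20
>>         -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps )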
>>
>>        Matt
>>
>>         -sanjay
>>
>>
>>
>>         On 12/23/12 4:56 PM, Jed Brown wrote:
>>>         Where is your matrix? It might be ending up with a very bad
>>>         pivot. If the problem can be reproduced, it should be
>>>         reported to the SuperLU_DIST developers to fix. (Note that
>>>         we do not see this with other matrices.) You can also try MUMPS.
>>>
>>>
>>>         On Sun, Dec 23, 2012 at 6:48 PM, Sanjay Govindjee
>>>         <s_g at berkeley.edu> wrote:
>>>
>>>             I wanted to use SuperLU_DIST to perform a direct solve
>>>             but seem to be encountering a problem.  I was wondering
>>>             if this is a known issue and if there is a solution for it.
>>>
>>>             The problem is easily observed using ex6.c in
>>>             src/ksp/ksp/examples/tests.
>>>
>>>             Out of the box: make runex6 produces a residual error of
>>>             O(1e-11), all is well.
>>>
>>>             I then changed the run to use two processors and added
>>>             the flag -pc_factor_mat_solver_package spooles; this
>>>             produces a residual error of O(1e-11), and all is still well.
>>>
>>>             I then switched over to -pc_factor_mat_solver_package
>>>             superlu_dist, and the residual error comes back as
>>>             22.6637!  Something seems very wrong.
>>>
>>>             My build is perfectly vanilla:
>>>
>>>             export PETSC_DIR=/Users/sg/petsc-3.3-p5/
>>>             export PETSC_ARCH=intel
>>>
>>>             ./configure --with-cc=icc --with-fc=ifort  \
>>>             -download-{spooles,parmetis,superlu_dist,prometheus,mpich,ml,hypre,metis}
>>>
>>>             make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel all
>>>             make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel test
>>>
>>>             -sanjay
>>>
>>>
>>
>>         -- 
>>         -----------------------------------------------
>>         Sanjay Govindjee, PhD, PE
>>         Professor of Civil Engineering
>>         Vice Chair for Academic Affairs
>>
>>         779 Davis Hall
>>         Structural Engineering, Mechanics and Materials
>>         Department of Civil Engineering
>>         University of California
>>         Berkeley, CA 94720-1710
>>
>>         Voice: +1 510 642 6060
>>         FAX: +1 510 643 5264
>>         s_g at berkeley.edu
>>         http://www.ce.berkeley.edu/~sanjay
>>         -----------------------------------------------
>>
>>         New Books:
>>
>>         Engineering Mechanics of Deformable
>>         Solids: A Presentation with Exercises
>>         http://www.oup.com/us/catalog/general/subject/Physics/MaterialsScience/?view=usa&ci=9780199651641
>>         http://ukcatalogue.oup.com/product/9780199651641.do
>>         http://amzn.com/0199651647
>>
>>
>>         Engineering Mechanics 3 (Dynamics)
>>         http://www.springer.com/materials/mechanics/book/978-3-642-14018-1
>>         http://amzn.com/3642140181
>>
>>         -----------------------------------------------
>>
>>
>>
>>
>>     -- 
>>     What most experimenters take for granted before they begin their
>>     experiments is infinitely more interesting than any results to
>>     which their experiments lead.
>>     -- Norbert Wiener
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener


