[petsc-users] Using superlu_dist in a direct solve

Jed Brown jedbrown at mcs.anl.gov
Sun Dec 23 18:56:36 CST 2012


Where does your matrix come from? It might be ending up with a very bad
pivot. If the problem can be reproduced, it should be reported to the
SuperLU_DIST developers so they can fix it. (Note that we do not see this
with other matrices.) You can also try MUMPS.
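
Switching to MUMPS only requires changing the run-time option. A minimal
sketch, assuming PETSc was configured with --download-mumps
--download-scalapack --download-blacs and that ex6 is given whatever
arguments the runex6 makefile target normally passes:

    mpiexec -n 2 ./ex6 -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps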


On Sun, Dec 23, 2012 at 6:48 PM, Sanjay Govindjee <s_g at berkeley.edu> wrote:

>  I wanted to use SuperLU_DIST to perform a direct solve but seem to be
> encountering a problem.  I was wondering if this is a known issue and if
> there is a solution for it.
>
> The problem is easily observed using ex6.c in src/ksp/ksp/examples/tests.
>
> Out of the box, make runex6 produces a residual error of O(1e-11); all is
> well.
>
> I then changed the run to use two processors and added the flag
> -pc_factor_mat_solver_package spooles.  This produces a residual error of
> O(1e-11), so all is still well.
>
> I then switched over to -pc_factor_mat_solver_package superlu_dist and the
> residual error comes back as 22.6637!  Something seems very wrong.  (This
> invocation is sketched below the quoted message.)
>
> My build is perfectly vanilla:
>
> export PETSC_DIR=/Users/sg/petsc-3.3-p5/
> export PETSC_ARCH=intel
>
> ./configure --with-cc=icc --with-fc=ifort  \
> -download-{spooles,parmetis,superlu_dist,prometheus,mpich,ml,hypre,metis}
>
> make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel all
> make PETSC_DIR=/Users/sg/petsc-3.3-p5/ PETSC_ARCH=intel test
>
> -sanjay
>
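
For reference, the three runs above differ only in the value of
-pc_factor_mat_solver_package. A sketch of the failing superlu_dist case,
with true-residual monitoring added to expose the error (the remaining ex6
arguments are whatever the runex6 makefile target passes):

    mpiexec -n 2 ./ex6 -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist \
        -ksp_monitor_true_residual -ksp_converged_reason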