[petsc-users] MUMPS in serial

Jed Brown jedbrown at mcs.anl.gov
Wed Jan 2 13:57:03 CST 2013


On Wed, Jan 2, 2013 at 1:55 PM, Nachiket Gokhale <gokhalen at gmail.com> wrote:

> Sorry, I wasn't aware of that. Is there anything that you
> particularly recommend for dense LU factorizations? Otherwise, I will
> fall back on the default factorizations in PETSc, which seem to work,
> though I haven't investigated them thoroughly.
>

PETSc calls LAPACK, which is the obvious choice for serial dense linear
algebra.
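
A minimal sketch of what that looks like, assuming a made-up 2x2 system and
current PETSc names (petsc-dev of this era also passed a MatStructure flag to
KSPSetOperators); PCLU on a MATSEQDENSE matrix dispatches to the built-in
LAPACK path:

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat      A;
    Vec      b, x;
    KSP      ksp;
    PC       pc;
    PetscInt i, j, n = 2;

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* Serial dense matrix; LU on seqdense is backed by LAPACK (getrf/getrs) */
    MatCreateSeqDense(PETSC_COMM_SELF, n, n, NULL, &A);
    for (i = 0; i < n; i++)
      for (j = 0; j < n; j++)
        MatSetValue(A, i, j, (i == j) ? 4.0 : 1.0, INSERT_VALUES);
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    VecCreateSeq(PETSC_COMM_SELF, n, &b);
    VecDuplicate(b, &x);
    VecSet(b, 1.0);

    KSPCreate(PETSC_COMM_SELF, &ksp);
    KSPSetOperators(ksp, A, A);   /* older petsc-dev: extra MatStructure arg */
    KSPSetType(ksp, KSPPREONLY);  /* direct solve: apply the factorization once */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);          /* default LU for seqdense uses LAPACK */
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

    KSPDestroy(&ksp);
    MatDestroy(&A);
    VecDestroy(&b);
    VecDestroy(&x);
    PetscFinalize();
    return 0;
  }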


>
>  -Nachiket
>
> On Wed, Jan 2, 2013 at 2:35 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> > On Wed, Jan 2, 2013 at 1:35 PM, Nachiket Gokhale <gokhalen at gmail.com>
> wrote:
> >>
> >> Yes, I did. I got this error message; the configure log shows that I
> >> installed MUMPS.
> >>
> >> [0]PETSC ERROR: --------------------- Error Message
> >> ------------------------------------
> >> [0]PETSC ERROR: No support for this operation for this object type!
> >> [0]PETSC ERROR: Matrix format seqdense does not have a solver package
> >> mumps for LU.
> >
> >
> > MUMPS is not a dense solver.
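
It needs a sparse (AIJ) matrix. For that case, a sketch of selecting MUMPS at
run time, assuming the option name from petsc-dev of this era (since renamed
-pc_factor_mat_solver_type) and the waiproj binary from the trace below:

  ./waiproj -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

The in-code equivalent was PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS).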
> >
> >>
> >> Perhaps you must ./configure with --download-mumps!
> >> [0]PETSC ERROR:
> >> ------------------------------------------------------------------------
> >> [0]PETSC ERROR: Petsc Development HG revision:
> >> cc5f6de4d644fb53ec2bbf114fa776073e3e8534  HG Date: Fri Dec 21 11:22:24
> >> 2012 -0600
> >> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> >> [0]PETSC ERROR: See docs/index.html for manual pages.
> >> [0]PETSC ERROR:
> >> ------------------------------------------------------------------------
> >> [0]PETSC ERROR: /home/gokhale/WAIGEN/GDEB-WAIGEN2012/bin/waiproj on a
> >> linux-gcc named asd1.wai.com by gokhale Wed Jan  2 14:29:19 2013
> >> [0]PETSC ERROR: Libraries linked from
> >> /opt/petsc/petsc-dev/linux-gcc-gpp-mpich-mumps-complex-elemental/lib
> >> [0]PETSC ERROR: Configure run at Fri Dec 21 14:30:56 2012
> >> [0]PETSC ERROR: Configure options --with-x=0 --with-mpi=1
> >> --download-mpich=yes --with-x11=0 --with-debugging=0
> >> --with-clanguage=C++ --with-shared-libraries=1 --download-mumps=yes
> >> --download-f-blas-lapack=1 --download-parmetis=1 --download-metis
> >> --download-scalapack=1 --download-blacs=1
> >> --with-cmake=/usr/bin/cmake28 --with-scalar-type=complex
> >> --download-elemental
> >> [0]PETSC ERROR:
> >> ------------------------------------------------------------------------
> >> [0]PETSC ERROR: MatGetFactor() line 3944 in
> >> /opt/petsc/petsc-dev/src/mat/interface/matrix.c
> >> [0]PETSC ERROR: PCSetUp_LU() line 133 in
> >> /opt/petsc/petsc-dev/src/ksp/pc/impls/factor/lu/lu.c
> >> [0]PETSC ERROR: PCSetUp() line 832 in
> >> /opt/petsc/petsc-dev/src/ksp/pc/interface/precon.c
> >> [0]PETSC ERROR: KSPSetUp() line 267 in
> >> /opt/petsc/petsc-dev/src/ksp/ksp/interface/itfunc.c
> >> [0]PETSC ERROR: KSPSolve() line 376 in
> >> /opt/petsc/petsc-dev/src/ksp/ksp/interface/itfunc.c
> >> [0]PETSC ERROR: main() line 169 in src/examples/waiproj.c
> >> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> >> [cli_0]: aborting job:
> >> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> >>
> >> -Nachiket
> >>
> >> On Wed, Jan 2, 2013 at 2:33 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> >> > Did you try it? Yes, it works.
> >> >
> >> >
> >> > On Wed, Jan 2, 2013 at 1:31 PM, Nachiket Gokhale <gokhalen at gmail.com>
> >> > wrote:
> >> >>
> >> Does MUMPS work with PETSc in serial, i.e., one MPI process? I need to
> >> run in serial because I have to perform certain dense matrix
> >> multiplications which do not work in parallel. If MUMPS does not
> >> work, I think I will try SuperLU.
> >> >>
> >> >>  -Nachiket
> >> >
> >> >
> >
> >
>