<div dir="ltr">On Wed, Jan 2, 2013 at 1:55 PM, Nachiket Gokhale <span dir="ltr"><<a href="mailto:gokhalen@gmail.com" target="_blank">gokhalen@gmail.com</a>></span> wrote:<br><div class="gmail_extra"><div class="gmail_quote">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Sorry, I wasn't aware of that. Is there any thing that you<br>
particularly recommend for dense LU factorizations? Otherwise, I will<br>
fall back on the default factorizations in petsc - which seem to work,<br>
> though I haven't investigated them thoroughly.

PETSc calls LAPACK, which is the obvious choice for serial dense linear algebra.
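For completeness, here is a minimal sketch of that path: a SeqDense matrix solved with the default LU, which ends up in LAPACK (getrf/getrs). It is untested, error checking (CHKERRQ) is left out for brevity, and it assumes the petsc-dev API of this vintage (MatGetVecs() and the four-argument KSPSetOperators(); newer PETSc renames/trims both).

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 10;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Sequential dense matrix, stored column-major so LAPACK can use it directly */
  MatCreateSeqDense(PETSC_COMM_SELF, n, n, NULL, &A);
  for (i = 0; i < n; i++) {
    MatSetValue(A, i, i, (PetscScalar)(i + 2), INSERT_VALUES);          /* diagonal */
    if (i + 1 < n) MatSetValue(A, i, i + 1, 1.0, INSERT_VALUES);        /* superdiagonal */
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatGetVecs(A, &x, &b);        /* MatCreateVecs() in newer PETSc */
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_SELF, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);  /* three-argument form in newer PETSc */
  KSPSetType(ksp, KSPPREONLY);  /* direct solve, no Krylov iteration */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);          /* default LU for seqdense calls LAPACK getrf/getrs */
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  VecDestroy(&x); VecDestroy(&b); MatDestroy(&A); KSPDestroy(&ksp);
  PetscFinalize();
  return 0;
}

The same solver is selectable at runtime with -ksp_type preonly -pc_type lu; the thing to avoid is -pc_factor_mat_solver_package mumps (or superlu), since those packages only register factorizations for sparse formats, which is exactly the MatGetFactor() error in the trace below.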
<span class="HOEnZb"><font color="#888888"><br>
-Nachiket<br>
</font></span><div class="HOEnZb"><div class="h5"><br>
On Wed, Jan 2, 2013 at 2:35 PM, Jed Brown <<a href="mailto:jedbrown@mcs.anl.gov">jedbrown@mcs.anl.gov</a>> wrote:<br>
> On Wed, Jan 2, 2013 at 1:35 PM, Nachiket Gokhale <<a href="mailto:gokhalen@gmail.com">gokhalen@gmail.com</a>> wrote:<br>
>><br>
>> Yes, I did. I got this error message - the configure log shows that I<br>
>> installed mumps.<br>
>><br>
>> 0]PETSC ERROR: --------------------- Error Message<br>
>> ------------------------------------<br>
>> [0]PETSC ERROR: No support for this operation for this object type!<br>
>> [0]PETSC ERROR: Matrix format seqdense does not have a solver package<br>
>> mumps for LU.<br>
><br>
><br>
> MUMPS is not a dense solver.<br>
><br>
>><br>
>> Perhaps you must ./configure with --download-mumps!<br>
>> [0]PETSC ERROR:<br>
>> ------------------------------------------------------------------------<br>
>> [0]PETSC ERROR: Petsc Development HG revision:<br>
>> cc5f6de4d644fb53ec2bbf114fa776073e3e8534 HG Date: Fri Dec 21 11:22:24<br>
>> 2012 -0600<br>
>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
>> [0]PETSC ERROR: See docs/index.html for manual pages.<br>
>> [0]PETSC ERROR:<br>
>> ------------------------------------------------------------------------<br>
>> [0]PETSC ERROR: /home/gokhale/WAIGEN/GDEB-WAIGEN2012/bin/waiproj on a<br>
>> linux-gcc named <a href="http://asd1.wai.com" target="_blank">asd1.wai.com</a> by gokhale Wed Jan 2 14:29:19 2013<br>
>> [0]PETSC ERROR: Libraries linked from<br>
>> /opt/petsc/petsc-dev/linux-gcc-gpp-mpich-mumps-complex-elemental/lib<br>
>> [0]PETSC ERROR: Configure run at Fri Dec 21 14:30:56 2012<br>
>> [0]PETSC ERROR: Configure options --with-x=0 --with-mpi=1<br>
>> --download-mpich=yes --with-x11=0 --with-debugging=0<br>
>> --with-clanguage=C++ --with-shared-libraries=1 --download-mumps=yes<br>
>> --download-f-blas-lapack=1 --download-parmetis=1 --download-metis<br>
>> --download-scalapack=1 --download-blacs=1<br>
>> --with-cmake=/usr/bin/cmake28 --with-scalar-type=complex<br>
>> --download-elemental<br>
>> [0]PETSC ERROR:<br>
>> ------------------------------------------------------------------------<br>
>> [0]PETSC ERROR: MatGetFactor() line 3944 in<br>
>> /opt/petsc/petsc-dev/src/mat/interface/matrix.c<br>
>> [0]PETSC ERROR: PCSetUp_LU() line 133 in<br>
>> /opt/petsc/petsc-dev/src/ksp/pc/impls/factor/lu/lu.c<br>
>> [0]PETSC ERROR: PCSetUp() line 832 in<br>
>> /opt/petsc/petsc-dev/src/ksp/pc/interface/precon.c<br>
>> [0]PETSC ERROR: KSPSetUp() line 267 in<br>
>> /opt/petsc/petsc-dev/src/ksp/ksp/interface/itfunc.c<br>
>> [0]PETSC ERROR: KSPSolve() line 376 in<br>
>> /opt/petsc/petsc-dev/src/ksp/ksp/interface/itfunc.c<br>
>> [0]PETSC ERROR: main() line 169 in src/examples/waiproj.c<br>
>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0<br>
>> [cli_0]: aborting job:<br>
>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0<br>
>><br>
>> -Nachiket<br>
>><br>
>> On Wed, Jan 2, 2013 at 2:33 PM, Jed Brown <<a href="mailto:jedbrown@mcs.anl.gov">jedbrown@mcs.anl.gov</a>> wrote:<br>
>> > Did you try it? Yes, it works.<br>
>> ><br>
>> ><br>
>> > On Wed, Jan 2, 2013 at 1:31 PM, Nachiket Gokhale <<a href="mailto:gokhalen@gmail.com">gokhalen@gmail.com</a>><br>
>> > wrote:<br>
>> >><br>
>> >> Does MUMPS work with PETSC in serial i.e. one MPI process? I need to<br>
>> >> run in serial because I have to perform certain dense matrix<br>
>> >> multiplications which do not work in parallel. If mumps does not<br>
>> >> work, I think I will try superlu.<br>
>> >><br>
>> >> -Nachiket<br>
>> ><br>
>> ><br>
><br>
><br>
</div></div></blockquote></div><br></div></div>