[petsc-dev] CHOLESKY with MUMPS doesn't work properly
Hong Zhang
hzhang at mcs.anl.gov
Fri Dec 9 14:43:10 CST 2011
Alexander :
> BTW, in MUMPS I can easily get the scaling vector after factorization
> (MUMPS scales the matrix before factorization) through mumid%ROWSCA.
> This scaling vector has been useful to me for some analysis.
> Would it be possible to get it through PETSc somehow?
I'll try it next week.
Hong
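For context, the field Alexander refers to belongs to the MUMPS Fortran structure itself: when MUMPS is driven directly (outside PETSc), the row scaling is available on the host once the factorization is done. A rough sketch only, assuming the complex-arithmetic ZMUMPS interface (to match the complex PETSc build) and with the matrix definition elided:

      PROGRAM rowsca_sketch
      IMPLICIT NONE
      INCLUDE 'mpif.h'
      INCLUDE 'zmumps_struc.h'
      TYPE (ZMUMPS_STRUC) :: mumid
      INTEGER :: ierr
      CALL MPI_INIT(ierr)
!     initialize a symmetric MUMPS instance (JOB = -1, SYM = 1 for SPD)
      mumid%COMM = MPI_COMM_WORLD
      mumid%SYM  = 1
      mumid%PAR  = 1
      mumid%JOB  = -1
      CALL ZMUMPS(mumid)
!     ... define the matrix on the host: mumid%N, mumid%NZ, mumid%IRN,
!         mumid%JCN, mumid%A ...
!     analysis + factorization; scaling behaviour is controlled by ICNTL(8)
      mumid%JOB = 4
      CALL ZMUMPS(mumid)
!     if MUMPS applied a scaling, the row scaling is now available on the
!     host in mumid%ROWSCA(1:mumid%N)
      mumid%JOB = -2
      CALL ZMUMPS(mumid)
      CALL MPI_FINALIZE(ierr)
      END PROGRAM rowsca_sketch

Whether PETSc can hand this array back to the user is exactly the open question above.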
>
>
> On 09.12.2011 18:44, Hong Zhang wrote:
>>
>> Are you sure MUMPS is installed correctly with PETSc?
>>
>>> [0]PETSC ERROR: No support for this operation for this object type!
>>> [0]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc
>>> CHOLESKY!
>>
>> Use '-ksp_view' to check which solver is being used.
>> It seems it is calling the PETSc Cholesky with an mpiaij matrix.
>>
>> Petsc-dev supports mumps cholesky with mpiaij format, e.g.,
>> petsc-dev/src/ksp/ksp/examples/tutorials>mpiexec -n 2 ./ex2 -pc_type
>> cholesky -pc_factor_mat_solver_package mumps -mat_type mpiaij
>> Norm of error 1.53436e-15 iterations 1
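For completeness, the same selection can be made from code instead of the command line. A rough Fortran sketch, assuming ksp is a KSP whose operators are already set, that the Fortran binding for PCFactorSetMatSolverPackage (the petsc-dev name at the time) is available, and with the solver package passed as the string 'mumps'; error checking is omitted:

#include <finclude/petscsys.h>
#include <finclude/petscvec.h>
#include <finclude/petscmat.h>
#include <finclude/petscpc.h>
#include <finclude/petscksp.h>
      subroutine SelectMumpsCholesky(ksp,ierr)
      implicit none
      KSP            ksp
      PC             pc
      PetscErrorCode ierr
!     fetch the preconditioner and select a Cholesky factorization
      call KSPGetPC(ksp,pc,ierr)
      call PCSetType(pc,PCCHOLESKY,ierr)
!     route the factorization to MUMPS (the built-in PETSc Cholesky
!     does not handle mpiaij)
      call PCFactorSetMatSolverPackage(pc,'mumps',ierr)
!     -ksp_view will then report a MUMPS Cholesky factorization
      call KSPSetFromOptions(ksp,ierr)
      return
      end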
>>
>> Hong
>>
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Development HG revision: a4f22f4fcb371a016e046e537076dcfd2ce5087f
>>> HG Date: Fri Dec 09 09:08:30 2011 -0600
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: /home/model on a openmpi-i named node233 by agrayver Fri Dec 9 17:23:30 2011
>>> [0]PETSC ERROR: Libraries linked from
>>> /home/lib/petsc-dev/openmpi-intel-complex-release-f-ds/lib
>>> [0]PETSC ERROR: Configure run at Fri Dec 9 16:51:10 2011
>>> [0]PETSC ERROR: Configure options --download-metis --download-mumps
>>> --download-parmetis --download-superlu_dist
>>> --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include
>>> --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a
>>> --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t
>>> --with-debugging=0 --with-fortran-interfaces=1 --with-fortran-kernels=1
>>> --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
>>> --with-petsc-arch=openmpi-intel-complex-release-f-ds
>>> --with-precision=double
>>> --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include
>>> --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a
>>> --with-scalar-type=complex --with-x=0
>>> PETSC_ARCH=openmpi-intel-complex-release-f-ds
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: MatGetFactor() line 3943 in
>>> /home/lib/petsc-dev/src/mat/interface/matrix.c
>>> [0]PETSC ERROR: PCFactorSetUpMatSolverPackage_Factor() line 17 in
>>> /home/lib/petsc-dev/src/ksp/pc/impls/factor/factimpl.c
>>> [0]PETSC ERROR: PCFactorSetUpMatSolverPackage() line 26 in
>>> /home/lib/petsc-dev/src/ksp/pc/impls/factor/factor.c
>>>
>>> Any idea?
>>>
>>> Regards,
>>> Alexander
>
>