[petsc-dev] CHOLESKY with MUMPS doesn't work properly
Matthew Knepley
knepley at gmail.com
Fri Dec 9 11:17:14 CST 2011
On Fri, Dec 9, 2011 at 10:31 AM, Alexander Grayver
<agrayver at gfz-potsdam.de> wrote:
> Hi dev-team,
>
> I have this code:
>
> call KSPCreate(comm3d,ksp,ierr);CHKERRQ(ierr)
> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr);CHKERRQ(ierr)
> call KSPSetType(ksp,KSPPREONLY,ierr);CHKERRQ(ierr)
> call KSPGetPC(ksp,pc,ierr);CHKERRQ(ierr)
> call PCSetType(pc,PCCHOLESKY,ierr);CHKERRQ(ierr)
> call PCFactorSetMatSolverPackage(pc,MATSOLVERMUMPS,ierr);CHKERRQ(ierr)
> call PCFactorSetUpMatSolverPackage(pc,ierr);CHKERRQ(ierr)
> call PCFactorGetMatrix(pc,F,ierr);CHKERRQ(ierr)
>
> which works well under petsc-3.2-p5 but produces an error with petsc-dev:
>
I believe that Cholesky only works with SBAIJ, not AIJ.
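Something like the following untested sketch might work; Asb is just an
illustrative name here, and since SBAIJ stores only the upper triangle,
A must actually be symmetric:

  Mat Asb
  ! Convert the AIJ operator to symmetric SBAIJ storage so that a
  ! Cholesky factorization path is available
  call MatConvert(A,MATSBAIJ,MAT_INITIAL_MATRIX,Asb,ierr);CHKERRQ(ierr)
  ! Hand the SBAIJ matrix to the KSP in place of A
  call KSPSetOperators(ksp,Asb,Asb,DIFFERENT_NONZERO_PATTERN,ierr);CHKERRQ(ierr)

Alternatively, if your code calls MatSetFromOptions() on A, the runtime
option -mat_type sbaij should have the same effect without code changes.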
Matt
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type!
> [0]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc CHOLESKY!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: a4f22f4fcb371a016e046e537076dcfd2ce5087f
> HG Date: Fri Dec 09 09:08:30 2011 -0600
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: /home/model on a openmpi-i named node233 by agrayver Fri Dec 9 17:23:30 2011
> [0]PETSC ERROR: Libraries linked from /home/lib/petsc-dev/openmpi-intel-complex-release-f-ds/lib
> [0]PETSC ERROR: Configure run at Fri Dec 9 16:51:10 2011
> [0]PETSC ERROR: Configure options --download-metis --download-mumps
> --download-parmetis --download-superlu_dist
> --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include
> --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a
> --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t
> --with-debugging=0 --with-fortran-interfaces=1 --with-fortran-kernels=1
> --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
> --with-petsc-arch=openmpi-intel-complex-release-f-ds
> --with-precision=double
> --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include
> --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a
> --with-scalar-type=complex --with-x=0
> PETSC_ARCH=openmpi-intel-complex-release-f-ds
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatGetFactor() line 3943 in /home/lib/petsc-dev/src/mat/interface/matrix.c
> [0]PETSC ERROR: PCFactorSetUpMatSolverPackage_Factor() line 17 in
> /home/lib/petsc-dev/src/ksp/pc/impls/factor/factimpl.c
> [0]PETSC ERROR: PCFactorSetUpMatSolverPackage() line 26 in
> /home/lib/petsc-dev/src/ksp/pc/impls/factor/factor.c
>
> Any idea?
>
> Regards,
> Alexander
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener