[petsc-users] MatMatSolve in sequential call
Alexander Grayver
agrayver at gfz-potsdam.de
Tue Aug 14 05:17:38 CDT 2012
On 13.08.2012 19:01, Hong Zhang wrote:
> Alexander :
> I need a simple code from you to reproduce the error for debugging.
Hong,
Attached you will find test.c that should reproduce this behavior with
petsc-3.3-p2.
Just run it with 1 and 2 procs as follows:
mpirun -n 1 ./test -ksp_type preonly -pc_type cholesky
-pc_factor_mat_solver_package mumps -ksp_view
(and the same with -n 2)
Here is the matrix I tested it on (15 MB):
https://dl.dropbox.com/u/60982984/A.dat
> If mumps' solver is activated, then you should get an error (sequential
> and parallel):
> [0]PETSC ERROR: No support for this operation for this object type!
> [0]PETSC ERROR: MatMatSolve_MUMPS() is not implemented yet!
>
> If petsc's direct solver is activated, then you should not be able to
> run it in parallel.
Please note that this has worked correctly for nproc > 1 for about a year now!
So I don't quite understand why an error should be thrown.
I just happened to test something in the sequential version, and that is
when this came up.
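Until that is resolved, a possible interim workaround might be to loop
MatSolve() over the columns of B by hand. A rough, untested sketch for the
sequential dense case (the function name is only illustrative; it assumes the
default column-major storage of dense matrices and that MatDenseGetArray() is
available in your PETSc version):

/* Emulate MatMatSolve(F,B,C) with one MatSolve() per right-hand-side column.
   Sequential dense B and C only; F is the factored matrix from PCFactorGetMatrix(). */
static PetscErrorCode MatMatSolveByColumns(Mat F,Mat B,Mat C)
{
  PetscErrorCode    ierr;
  PetscInt          j,N,nrhs;
  Vec               bcol,xcol;
  PetscScalar       *carr;
  const PetscScalar *xarr;

  ierr = MatGetSize(B,&N,&nrhs);CHKERRQ(ierr);
  ierr = VecCreateSeq(PETSC_COMM_SELF,N,&bcol);CHKERRQ(ierr);
  ierr = VecCreateSeq(PETSC_COMM_SELF,N,&xcol);CHKERRQ(ierr);
  ierr = MatDenseGetArray(C,&carr);CHKERRQ(ierr);
  for (j=0; j<nrhs; j++) {
    ierr = MatGetColumnVector(B,bcol,j);CHKERRQ(ierr);  /* j-th right-hand side */
    ierr = MatSolve(F,bcol,xcol);CHKERRQ(ierr);
    ierr = VecGetArrayRead(xcol,&xarr);CHKERRQ(ierr);
    /* copy the solution into column j of C (leading dimension assumed to be N) */
    ierr = PetscMemcpy(carr+j*N,xarr,N*sizeof(PetscScalar));CHKERRQ(ierr);
    ierr = VecRestoreArrayRead(xcol,&xarr);CHKERRQ(ierr);
  }
  ierr = MatDenseRestoreArray(C,&carr);CHKERRQ(ierr);
  ierr = VecDestroy(&bcol);CHKERRQ(ierr);
  ierr = VecDestroy(&xcol);CHKERRQ(ierr);
  return 0;
}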
>
> Please check ~petsc/src/mat/examples/tests/ex125.c
> to see if you used same calling procedure as this example.
> Note: I fixed a bug in this example and patched petsc-3.3.
> The updated copy of ex125.c is attached for your convenience.
>
> Hong
>
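For reference, ex125.c obtains the factored matrix directly with MatGetFactor()
rather than through KSP, whereas my test below goes through KSPSolve() and
PCFactorGetMatrix(). The ex125.c-style sequence is roughly the following (a
sketch only, with Cholesky as in my runs; the ordering type is illustrative,
see the example itself for the authoritative version):

  Mat           F;
  IS            perm,iperm;
  MatFactorInfo info;

  ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
  ierr = MatGetOrdering(A,MATORDERINGNATURAL,&perm,&iperm);CHKERRQ(ierr);
  ierr = MatGetFactor(A,MATSOLVERMUMPS,MAT_FACTOR_CHOLESKY,&F);CHKERRQ(ierr);
  ierr = MatCholeskyFactorSymbolic(F,A,perm,&info);CHKERRQ(ierr);
  ierr = MatCholeskyFactorNumeric(F,A,&info);CHKERRQ(ierr);
  ierr = MatMatSolve(F,B,C);CHKERRQ(ierr);  /* the same call that fails for me sequentially */
  ierr = ISDestroy(&perm);CHKERRQ(ierr);
  ierr = ISDestroy(&iperm);CHKERRQ(ierr);
  ierr = MatDestroy(&F);CHKERRQ(ierr);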
--
Regards,
Alexander
-------------- next part --------------
#include <petscksp.h>
#include <petscviewer.h>
static char help[] = "Loads a matrix from a binary file, solves with a direct solver, and calls MatMatSolve() on the factored matrix.\n";
#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **args)
{
Vec x,b;
Mat A,B,C,F;
KSP ksp;
PC pc;
PetscRandom rctx;
PetscInt M,N;
PetscErrorCode ierr;
PetscViewer viewer;
PetscInitialize(&argc,&args,(char *)0,help);
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
ierr = MatSetFromOptions(A);CHKERRQ(ierr);
ierr = MatLoad(A,viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
ierr = MatGetSize(A,&N,&M);CHKERRQ(ierr);
ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,M,&x);CHKERRQ(ierr);
ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,N,&b);CHKERRQ(ierr);
/* dense right-hand-side block B and solution block C, 10 columns each */
ierr = MatCreateDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,N,10,PETSC_NULL,&B);CHKERRQ(ierr);
ierr = MatCreateDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,N,10,PETSC_NULL,&C);CHKERRQ(ierr);
ierr = PetscRandomCreate(PETSC_COMM_WORLD,&rctx);CHKERRQ(ierr);
ierr = PetscRandomSetFromOptions(rctx);CHKERRQ(ierr);
ierr = VecSetRandom(b,rctx);CHKERRQ(ierr);
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCFactorGetMatrix(pc,&F);CHKERRQ(ierr);
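/* With -pc_factor_mat_solver_package mumps, this MatMatSolve() call is what
   triggers "MatMatSolve_MUMPS() is not implemented yet" on 1 process */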
ierr = MatMatSolve(F,B,C);CHKERRQ(ierr);
ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
ierr = MatDestroy(&A);CHKERRQ(ierr);
ierr = MatDestroy(&B);CHKERRQ(ierr);
ierr = MatDestroy(&C);CHKERRQ(ierr);
ierr = VecDestroy(&x);CHKERRQ(ierr);
ierr = VecDestroy(&b);CHKERRQ(ierr);
ierr = PetscRandomDestroy(&rctx);CHKERRQ(ierr);
ierr = PetscFinalize();
return 0;
}