A problem with the error "An error occurred in MPI_Comm_rank after MPI was finalized"
Yin Feng
yfeng1 at tigers.lsu.edu
Sun Oct 11 22:12:41 CDT 2009
I know of the function CHKERRQ, but what does CHKMEMQ mean?
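
(For reference, the usual PETSc error-checking idiom looks roughly like the sketch below: CHKERRQ returns from the current function with a traceback as soon as a PETSc call reports a nonzero error code, and CHKMEMQ checks PETSc-allocated memory for corruption at that point when memory debugging is enabled. The function name and body here are made up for illustration and match the older VecDestroy(x) calling style used in the code below.)

    #include "petscvec.h"

    PetscErrorCode MySolve(void)
    {
      Vec            x;
      PetscErrorCode ierr;

      ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);   /* bail out with a traceback on failure */
      ierr = VecSetSizes(x,PETSC_DECIDE,10);CHKERRQ(ierr);
      CHKMEMQ;                                                /* check heap integrity at this point */
      ierr = VecDestroy(x);CHKERRQ(ierr);
      return 0;
    }
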
On Sun, Oct 11, 2009 at 10:06 PM, Yaakoub El Khamra <yelkhamra at gmail.com> wrote:
> You need to set types for the matrices. You also want to check the errors
> returned by the PETSc functions; the macro for that is CHKERRQ. You might
> also want to do a CHKMEMQ.
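
(A minimal sketch of the "set types for matrices" part, assuming a sequential AIJ matrix since the code below is meant for a single process; a -mat_type option on the command line can still override it through MatSetFromOptions:)

    MatCreate(PETSC_COMM_WORLD,&A);
    MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
    MatSetType(A,MATSEQAIJ);     /* explicit matrix type instead of relying on the default */
    MatSetFromOptions(A);
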
>
>
> Regards
> Yaakoub El Khamra
>
>
>
>
> On Sun, Oct 11, 2009 at 9:59 PM, Yin Feng <yfeng1 at tigers.lsu.edu> wrote:
> > The complete code is:
> >
> > void PETSc(double mat[],double vec[],double sol[],long N,double tol) {
> >   Vec x,b;
> >   Mat A;
> >   KSP ksp;
> >   PC  pc;
> >   PetscInt    i,j,col[N];
> >   PetscScalar value[N];
> >   PetscScalar val;
> >
> >   PetscInitialize(PETSC_NULL,PETSC_NULL,PETSC_NULL,PETSC_NULL);
> >
> >   /* create the solution and right-hand-side vectors */
> >   VecCreate(PETSC_COMM_WORLD,&x);
> >   VecSetSizes(x,PETSC_DECIDE,N);
> >   VecSetFromOptions(x);
> >   VecDuplicate(x,&b);
> >
> >   /* create the matrix and fill it one dense row at a time */
> >   MatCreate(PETSC_COMM_WORLD,&A);
> >   MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
> >   MatSetFromOptions(A);
> >   for (i=0; i<N; i++) {
> >     for (j=0; j<N; j++) {
> >       value[j]=mat[i*N+j];
> >       col[j]=j;
> >     }
> >     MatSetValues(A,1,&i,N,col,value,INSERT_VALUES);
> >   }
> >   MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
> >   MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
> >
> >   /* fill the right-hand side and the initial guess */
> >   for (i=0; i<N; i++) {
> >     val=vec[i];
> >     VecSetValues(b,1,&i,&val,INSERT_VALUES);
> >     val=sol[i];
> >     VecSetValues(x,1,&i,&val,INSERT_VALUES);
> >   }
> >   VecAssemblyBegin(b);
> >   VecAssemblyEnd(b);
> >
> >   /* set up and run the GMRES / block-Jacobi solve */
> >   KSPCreate(PETSC_COMM_WORLD,&ksp);
> >   KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> >   KSPGetPC(ksp,&pc);
> >   KSPSetType(ksp, KSPGMRES);
> >   PCSetType(pc, PCBJACOBI);
> >   KSPSetTolerances(ksp,tol,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);
> >   KSPSetFromOptions(ksp);
> >   KSPSolve(ksp,b,x);
> >
> >   /* copy the solution back out */
> >   for (i=0; i<N; i++) {
> >     VecGetValues(x,1,&i,&sol[i]);
> >   }
> >
> >   /* clean up and finalize on every call */
> >   VecDestroy(x);
> >   VecDestroy(b);
> >   MatDestroy(A);
> >   KSPDestroy(ksp);
> >   PetscFinalize();
> > }
> >
> > Once this function is called in a loop, it reports an error.
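
(A likely explanation: PetscInitialize() calls MPI_Init() and PetscFinalize() calls MPI_Finalize(), and MPI allows each of those only once per process, so the second trip through the loop runs against an already-finalized MPI. A minimal sketch of one way to restructure, assuming the PETSc() function above is changed to drop its own PetscInitialize/PetscFinalize calls; the driver and its sample data below are hypothetical:)

    #include "petscksp.h"

    extern void PETSc(double mat[],double vec[],double sol[],long N,double tol);

    int main(int argc,char **argv)
    {
      double mat[4] = {4,1,1,3}, vec[2] = {1,2}, sol[2] = {0,0};
      int    step;

      PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);  /* MPI_Init happens here, once */
      for (step=0; step<10; step++)
        PETSc(mat,vec,sol,2,1e-8);                         /* safe to call repeatedly now */
      PetscFinalize();                                      /* MPI_Finalize happens here, once */
      return 0;
    }
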
> > On Sun, Oct 11, 2009 at 9:38 PM, Yin Feng <yfeng1 at tigers.lsu.edu> wrote:
> >>
> >> I put the PETSc solver in one function and use another function to call
> >> it. This problem only appears when I call the function containing the
> >> PETSc solver in a loop: it works on the first pass, then reports the error
> >> "An error occurred in MPI_Comm_rank after MPI was finalized" on the second.
> >> The program is designed to support only one processor, like:
> >>
> >> Vec x,b;
> >> Mat A;
> >> KSP ksp;
> >> PC pc;
> >> PetscInt i,j,col[N];
> >> PetscScalar value[N];
> >> PetscScalar val;
> >>
> >> PetscInitialize(PETSC_NULL,PETSC_NULL,PETSC_NULL,PETSC_NULL);
> >>
> >> VecCreate(PETSC_COMM_WORLD,&x);
> >> VecSetSizes(x,PETSC_DECIDE,N);
> >> VecSetFromOptions(x);
> >> VecDuplicate(x,&b);
> >>
> >> MatCreate(PETSC_COMM_WORLD,&A);
> >> MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
> >> MatSetFromOptions(A);
> >>
> >> ........
> >> ........
> >>
> >> KSPCreate(PETSC_COMM_WORLD,&ksp);
> >> KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
> >> KSPGetPC(ksp,&pc);
> >>
> >> ................
> >> ...............
> >>
> >> KSPSetTolerances(ksp,tol,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);
> >> KSPSetFromOptions(ksp);
> >> KSPSolve(ksp,b,x);
> >>
> >> VecDestroy(x);
> >> VecDestroy(b);
> >> MatDestroy(A);
> >> KSPDestroy(ksp);
> >>
> >> PetscFinalize();
> >>
> >> Does anyone have any idea about this?
> >>
> >> The detailed error description is:
> >>
> >> An error occurred in MPI_Comm_rank
> >> *** after MPI was finalized
> >> *** MPI_ERRORS_ARE_FATAL (goodbye)
> >> Abort before MPI_INIT completed successfully; not able to guarantee that
> >> all other processes were killed!
> >>
> >> Thank you so much in advance!
> >
> >
>