A problem with the error "An error occurred in MPI_Comm_rank after MPI was finalized"

Matthew Knepley knepley at gmail.com
Sun Oct 11 23:20:55 CDT 2009


My quick guess is that you are not preallocating the matrix correctly.
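
For the dense rows being filled in the code below, that would look
roughly like this (a sketch, assuming a sequential AIJ matrix; the
count of N nonzeros per row matches the posted assembly loop):

   MatCreate(PETSC_COMM_WORLD,&A);
   MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
   MatSetFromOptions(A);
   /* tell PETSc up front that every row holds N nonzeros, so
      MatSetValues() never has to grow the storage on the fly */
   MatSeqAIJSetPreallocation(A,N,PETSC_NULL);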

  Matt

On Sun, Oct 11, 2009 at 11:01 PM, Yin Feng <yfeng1 at tigers.lsu.edu> wrote:

> I moved PetscInitialize and PetscFinalize out of the loop and it works now.
> Thank you, Barry!
>
> However, I found that it runs very slowly. Do you know the reason for that?
>
>
> On Sun, Oct 11, 2009 at 10:06 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>  MPI does NOT allow calling MPI_Init() twice. (By default
>> PetscInitialize() calls MPI_Init().)
>>
>>  You need to take the PetscInitialize() out of the subroutine and instead
>> call it once at the beginning of your program.
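>>
>>  Schematically (a sketch; nsteps and the data arrays here are
>>  hypothetical, and PETSc() is your subroutine with the
>>  Initialize/Finalize calls removed):
>>
>>  #include "petscksp.h"
>>
>>  extern void PETSc(double[],double[],double[],long,double);
>>
>>  int main(int argc,char **argv)
>>  {
>>    long   N = 100;
>>    int    step, nsteps = 10;
>>    double mat[100*100], vec[100], sol[100], tol = 1.e-8;
>>
>>    PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL); /* once */
>>    for (step = 0; step < nsteps; step++) {
>>      /* ... fill mat and vec for this step ... */
>>      PETSc(mat,vec,sol,N,tol);  /* no Initialize/Finalize inside */
>>    }
>>    PetscFinalize();             /* once, at the very end */
>>    return 0;
>>  }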
>>
>>   Barry
>>
>>
>> On Oct 11, 2009, at 9:59 PM, Yin Feng wrote:
>>
>>  The complete code is:
>>>
>>> #include "petscksp.h"
>>>
>>> void PETSc(double mat[],double vec[],double sol[],long N,double tol) {
>>>        Vec            x,b;
>>>        Mat            A;
>>>        KSP            ksp;
>>>        PC             pc;
>>>        PetscInt       i,j,col[N];
>>>        PetscScalar    value[N];
>>>        PetscScalar    val;
>>>
>>>        PetscInitialize(PETSC_NULL,PETSC_NULL,PETSC_NULL,PETSC_NULL);
>>>        VecCreate(PETSC_COMM_WORLD,&x);
>>>        VecSetSizes(x,PETSC_DECIDE,N);
>>>        VecSetFromOptions(x);
>>>        VecDuplicate(x,&b);
>>>
>>>        MatCreate(PETSC_COMM_WORLD,&A);
>>>        MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
>>>        MatSetFromOptions(A);
>>>
>>>        for (i=0; i<N; i++) {
>>>                for (j=0;j<N;j++) {
>>>                        value[j]=mat[i*N+j];
>>>                        col[j]=j;
>>>                }
>>>                MatSetValues(A,1,&i,N,col,value,INSERT_VALUES);
>>>        }
>>>
>>>        MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
>>>        MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
>>>
>>>        for (i=0; i<N; i++) {
>>>                val=vec[i];
>>>                VecSetValues(b,1,&i,&val,INSERT_VALUES);
>>>                val=sol[i];
>>>                VecSetValues(x,1,&i,&val,INSERT_VALUES);
>>>        }
>>>
>>>        /* x was also filled with VecSetValues() above, so it must
>>>           be assembled as well before it is used in KSPSolve() */
>>>        VecAssemblyBegin(b);
>>>        VecAssemblyEnd(b);
>>>        VecAssemblyBegin(x);
>>>        VecAssemblyEnd(x);
>>>
>>>        KSPCreate(PETSC_COMM_WORLD,&ksp);
>>>        KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>>>        KSPGetPC(ksp,&pc);
>>>        KSPSetType(ksp, KSPGMRES);
>>>        PCSetType(pc,  PCBJACOBI);
>>>
>>>        KSPSetTolerances(ksp,tol,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);
>>>        KSPSetFromOptions(ksp);
>>>        KSPSolve(ksp,b,x);
>>>
>>>        for (i=0; i<N; i++) {
>>>                VecGetValues(x,1,&i,&sol[i]);
>>>        }
>>>
>>>        VecDestroy(x);
>>>        VecDestroy(b);
>>>        MatDestroy(A);
>>>        KSPDestroy(ksp);
>>>        PetscFinalize();
>>> }
>>>
>>> Once this function is called in a loop, it reports an error.
>>>
>>> On Sun, Oct 11, 2009 at 9:38 PM, Yin Feng <yfeng1 at tigers.lsu.edu> wrote:
>>> I put the PETSc solver in one function and use another function to call it.
>>> This problem only appears when I call the function containing the PETSc
>>> solver in a loop: it works well on the first step, then reports the error
>>> "An error occurred in MPI_Comm_rank after MPI was finalized" the second
>>> time. The program is designed to support only one processor, like:
>>>
>>> Vec            x,b;
>>> Mat            A;
>>> KSP            ksp;
>>> PC             pc;
>>> PetscInt       i,j,col[N];
>>> PetscScalar    value[N];
>>> PetscScalar    val;
>>>
>>> PetscInitialize(PETSC_NULL,PETSC_NULL,PETSC_NULL,PETSC_NULL);
>>> VecCreate(PETSC_COMM_WORLD,&x);
>>> VecSetSizes(x,PETSC_DECIDE,N);
>>> VecSetFromOptions(x);
>>> VecDuplicate(x,&b);
>>> MatCreate(PETSC_COMM_WORLD,&A);
>>> MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);
>>> MatSetFromOptions(A);
>>>        ........
>>>        ........
>>> KSPCreate(PETSC_COMM_WORLD,&ksp);
>>> KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
>>> KSPGetPC(ksp,&pc);
>>>
>>>        ................
>>>        ...............
>>>
>>>        KSPSetTolerances(ksp,tol,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);
>>>        KSPSetFromOptions(ksp);
>>>        KSPSolve(ksp,b,x);
>>>
>>>        VecDestroy(x);
>>>        VecDestroy(b);
>>>        MatDestroy(A);
>>>        KSPDestroy(ksp);
>>>        PetscFinalize();
>>>
>>> Does anyone have an idea about this?
>>> The detailed error description is:
>>> An error occurred in MPI_Comm_rank
>>> *** after MPI was finalized
>>> *** MPI_ERRORS_ARE_FATAL (goodbye)
>>> Abort before MPI_INIT completed successfully; not able to guarantee that
>>> all other processes were killed!
>>>
>>> Thank you so much in advance!
>>>
>>>
>>>
>>>
>>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener