I moved PetscInitialize and PetscFinalize out of the loop and it works now. Thank you, Barry!

However, I found that it now runs very slowly. Do you know what might be causing that?

On Sun, Oct 11, 2009 at 10:06 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
MPI does NOT allow calling MPI_Init() twice. (By default PetscInitialize() calls MPI_Init().)

You need to take the PetscInitialize() out of the subroutine and instead call it once at the beginning of your program.

  Barry
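For illustration, a minimal sketch of this restructuring (the driver below is hypothetical; the step count, problem size, and the way mat, vec, and sol are filled are placeholders): PetscInitialize() and PetscFinalize() are each called exactly once in main(), and the solver routine is called as many times as needed in between.

#include <petscksp.h>

/* the poster's solver routine, with PetscInitialize()/PetscFinalize() removed from it */
void PETSc(double mat[], double vec[], double sol[], long N, double tol);

int main(int argc, char **argv)
{
    double mat[10*10], vec[10], sol[10];
    int    step;

    PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);  /* MPI_Init() happens here, once */

    for (step = 0; step < 5; step++) {
        /* ... fill mat, vec, sol for this step ... */
        PETSc(mat, vec, sol, 10, 1.e-8);                    /* safe to call repeatedly now */
    }

    PetscFinalize();                                         /* MPI_Finalize() happens here, once */
    return 0;
}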
On Oct 11, 2009, at 9:59 PM, Yin Feng wrote:
The completed code is:
void PETSc(double mat[], double vec[], double sol[], long N, double tol) {
    Vec         x, b;
    Mat         A;
    KSP         ksp;
    PC          pc;
    PetscInt    i, j, col[N];
    PetscScalar value[N];
    PetscScalar val;

    PetscInitialize(PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL);
    VecCreate(PETSC_COMM_WORLD, &x);
    VecSetSizes(x, PETSC_DECIDE, N);
    VecSetFromOptions(x);
    VecDuplicate(x, &b);

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
    MatSetFromOptions(A);

    for (i = 0; i < N; i++) {
        for (j = 0; j < N; j++) {
            value[j] = mat[i*N + j];
            col[j]   = j;
        }
        MatSetValues(A, 1, &i, N, col, value, INSERT_VALUES);
    }

    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    for (i = 0; i < N; i++) {
        val = vec[i];
        VecSetValues(b, 1, &i, &val, INSERT_VALUES);
        val = sol[i];
        VecSetValues(x, 1, &i, &val, INSERT_VALUES);
    }

    VecAssemblyBegin(b);
    VecAssemblyEnd(b);

    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
    KSPGetPC(ksp, &pc);
    KSPSetType(ksp, KSPGMRES);
    PCSetType(pc, PCBJACOBI);

    KSPSetTolerances(ksp, tol, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

    for (i = 0; i < N; i++) {
        VecGetValues(x, 1, &i, &sol[i]);
    }

    VecDestroy(x);
    VecDestroy(b);
    MatDestroy(A);
    KSPDestroy(ksp);
    PetscFinalize();
}
Once this function is called in a loop, it reports an error.
On Sun, Oct 11, 2009 at 9:38 PM, Yin Feng <yfeng1@tigers.lsu.edu> wrote:
I put the PETSc solver in one function and call that function from another.
The problem only appears when I call the function containing the PETSc solver in a loop:
it works fine the first time, but the second time it reports the error
"An error occurred in MPI_Comm_rank after MPI was finalized".
The program is designed to support only one processor, like this:
Vec         x, b;
Mat         A;
KSP         ksp;
PC          pc;
PetscInt    i, j, col[N];
PetscScalar value[N];
PetscScalar val;

PetscInitialize(PETSC_NULL, PETSC_NULL, PETSC_NULL, PETSC_NULL);
VecCreate(PETSC_COMM_WORLD, &x);
VecSetSizes(x, PETSC_DECIDE, N);
VecSetFromOptions(x);
VecDuplicate(x, &b);
MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
MatSetFromOptions(A);
........
........
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
KSPGetPC(ksp, &pc);

................
...............
KSPSetTolerances(ksp, tol, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
KSPSetFromOptions(ksp);
KSPSolve(ksp, b, x);

VecDestroy(x);
VecDestroy(b);
MatDestroy(A);
KSPDestroy(ksp);
PetscFinalize();
Does anyone have an idea about this?
The detailed error message is:
An error occurred in MPI_Comm_rank
*** after MPI was finalized
*** MPI_ERRORS_ARE_FATAL (goodbye)
Abort before MPI_INIT completed successfully; not able to guarantee that all other processes were killed!
Thank you so much in advance!