[petsc-users] Debugging KSP output error

TAY wee-beng zonexo at gmail.com
Thu Aug 20 04:01:33 CDT 2015


On 20/8/2015 3:29 PM, Dave May wrote:
>
>
> On 20 August 2015 at 05:28, TAY wee-beng <zonexo at gmail.com> wrote:
>
>     Hi,
>
>     I run my code on 1, 2 and 3 procs. KSP is used to solve the
>     Poisson eqn.
>
>     Using MatView and VecView, I found that my LHS matrix and RHS vec
>     are the same for 1, 2, and 3 procs.
>
>     However, my pressure (ans) output is almost the same for 1 and 2
>     procs (differing only by truncation error).
>
>     But for 3 procs, the output matches the 1 and 2 proc results for
>     all values except:
>
>     1. the last few values for proc 0
>
>     2. the first and last few values for procs 1 and 2.
>
>     Shouldn't the output be the same when the LHS matrix and RHS vec
>     are the same? How can I debug this to find the error?
>
>
> It's a bit hard to say much without knowing exactly what solver
> configuration you actually ran, and without seeing the difference in
> the solution you are referring to.
>
> Some preconditioners have different behaviour in serial and parallel. 
> Thus, the convergence of the solver and the residual history (and thus 
> the answer) can look slightly different. This difference will become 
> smaller as you solve the system more accurately.
> Do you solve the system accurately, e.g. with something like -ksp_rtol 1.0e-10?
>
> To avoid the problem mentioned above, try using -pc_type jacobi. This
> PC is the same in serial and parallel. Thus, if your A and b are
> identical on 1, 2, and 3 procs, then the residuals and solution will
> also be identical on 1, 2, and 3 procs (up to machine precision).
>
Hi Dave,

I tried using jacobi and got the same result. I found out that the
error is actually due to a mismatch between the ownership ranges
implied by DMDACreate3d and those reported by MatGetOwnershipRange.
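
For reference, the jacobi test was run along these lines (the
executable name is a placeholder; the options are the ones Dave
suggested):

    mpiexec -n 3 ./solver -pc_type jacobi -ksp_rtol 1.0e-10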

Using

call DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE, &
     DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,size_z, &
     1,PETSC_DECIDE,PETSC_DECIDE,1,stencil_width, &
     lx,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da_w,ierr)

call DMDAGetCorners(da_u,start_ijk(1),start_ijk(2),start_ijk(3), &
     width_ijk(1),width_ijk(2),width_ijk(3),ierr)

and

call MatCreateAIJ(MPI_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE, &
     size_x*size_y*size_z,size_x*size_y*size_z, &
     7,PETSC_NULL_INTEGER,7,PETSC_NULL_INTEGER,A_mat,ierr)

call MatGetOwnershipRange(A_mat,ijk_sta_p,ijk_end_p,ierr)

Is this mismatch expected, or is there an error somewhere? It only
happens with 3 procs, not with 1 or 2.
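
One way to guarantee that the matrix layout matches the DMDA
decomposition would be to let the DM create the matrix, so that A_mat
inherits the DM's row distribution and preallocation. A minimal sketch
(assuming PETSc 3.5+, where DMCreateMatrix takes only the DM and the
matrix):

call DMCreateMatrix(da_w,A_mat,ierr)
! A_mat now has the same parallel row distribution as da_w, so
! MatGetOwnershipRange will agree with DMDAGetCorners
call MatGetOwnershipRange(A_mat,ijk_sta_p,ijk_end_p,ierr)

(Such a matrix uses the DMDA's global ordering, so it is most naturally
filled with MatSetValuesStencil.) Equivalently, one could pass the
DMDA's local size, width_ijk(1)*width_ijk(2)*width_ijk(3), to
MatCreateAIJ instead of PETSC_DECIDE.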

For size_x,size_y,size_z = 4,8,10 on 3 procs, DMDACreate3d partitions
along the z direction as 1->4, 5->7, 8->10, which should give ownership
ranges (with Fortran 1-based indexing, i.e. + 1) of:

myid,ijk_sta_p,ijk_end_p           1         129         192
myid,ijk_sta_p,ijk_end_p           0           1         128
myid,ijk_sta_p,ijk_end_p           2         193         320

But with MatGetOwnershipRange, I get:

myid,ijk_sta_p,ijk_end_p           1         108         214
myid,ijk_sta_p,ijk_end_p           0           1         107
myid,ijk_sta_p,ijk_end_p           2         215         320
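
The two listings are consistent with two different layouts. A quick
check of the arithmetic, assuming MatCreateAIJ with PETSC_DECIDE splits
the rows as evenly as possible (PETSc's default):

    N = size_x*size_y*size_z = 4*8*10 = 320 rows over 3 procs
    320/3 = 106 remainder 2  ->  107, 107, 106 rows per proc
    1-based ranges: 1-107, 108-214, 215-320

This matches the MatGetOwnershipRange output above exactly, whereas the
DMDA ranges are whole multiples of the 4*8 = 32 nodes in a z-plane.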

> Thanks,
>   Dave
>
>
>
>     -- 
>     Thank you
>
>     Yours sincerely,
>
>     TAY wee-beng
>
>
