[petsc-users] Solve KSP in parallel.
Manuel Valera
mvalera at mail.sdsu.edu
Mon Sep 26 16:34:05 CDT 2016
Indeed there is something wrong with that call; it hangs indefinitely,
showing only:
Mat Object: 1 MPI processes
type: mpiaij
It draws my attention that this program works for 1 processor but not for
more, yet it doesn't show anything for that viewer in either case.
Thanks for the insight on the redundant calls; it is not very clear in the
documentation which calls are included in others.
On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> The call to MatCreateMPIAIJWithArrays() is likely interpreting the
> values you pass in differently than you expect.
>
> Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) after the
> MatCreateMPIAIJWithArrays() to see what PETSc thinks the matrix is.
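> For example, something along these lines (a sketch reusing the variable
> names from the code below; note that MatView with PETSC_VIEWER_STDOUT_WORLD
> is collective, so every process must make the call or the program will hang):
>
>     call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel), &
>          PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
>     ! print the parallel layout and entries as PETSc sees them
>     call MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr)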
>
>
> > On Sep 26, 2016, at 3:42 PM, Manuel Valera <mvalera at mail.sdsu.edu> wrote:
> >
> > Hello,
> >
> > I'm working on solving a linear system in parallel. Following ex12 of the
> > KSP tutorial, I don't see any major complication in doing so, so for a
> > working linear system solver with PCJACOBI and KSPGCR I made only the
> > following changes:
> >
> > call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
> > ! call MatSetType(Ap,MATSEQAIJ,ierr)
> > call MatSetType(Ap,MATMPIAIJ,ierr) ! parallelization
> >
> > call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr);
> >
> > ! call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)
> > call MatSetFromOptions(Ap,ierr)
>
> Note that none of the lines above are needed (or do anything) because
> the MatCreateMPIAIJWithArrays() creates the matrix from scratch itself.
>
> Barry
>
> > ! call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr)
> > call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
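> Also watch the second argument here: it must be the number of rows owned
> by this process, and those local counts have to sum to nbdp across all
> processes, while floor(real(nbdp)/sizel) silently drops rows whenever
> nbdp is not divisible by sizel. The iapi/japi/app arrays must likewise
> describe only those local rows. A sketch of a split that always sums to
> nbdp (mloc and rank are hypothetical names for the local row count and
> the MPI rank):
>
>     mloc = nbdp/sizel                            ! integer division
>     if (rank < mod(nbdp,sizel)) mloc = mloc + 1  ! spread the remainder
>     call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,mloc,PETSC_DECIDE, &
>          nbdp,nbdp,iapi,japi,app,Ap,ierr)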
> >
> >
> > The commented-out lines are the ones from the sequential implementation.
> >
> > So, it does not complain at runtime until it reaches KSPSolve(), with the following error:
> >
> >
> > [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [1]PETSC ERROR: Object is in wrong state
> > [1]PETSC ERROR: Matrix is missing diagonal entry 0
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown
> > [1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1 --download-ml=1
> > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c
> > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c
> > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> > [1]PETSC ERROR: #4 PCSetUp() line 968 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line 650 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
> > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: #9 KSPSolve() line 600 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
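> Notice what the stack is telling you: the run is using the parallel
> default preconditioner, block Jacobi with an ILU(0) factorization of each
> local block, not the PCJACOBI you mention, and that factorization requires
> every row to store its diagonal entry (even as an explicit zero). A sketch
> of forcing the intended solver instead (assuming ksp is your KSP object
> and pc a PC declared next to it):
>
>     call KSPSetType(ksp,KSPGCR,ierr)
>     call KSPGetPC(ksp,pc,ierr)
>     call PCSetType(pc,PCJACOBI,ierr)  ! pointwise Jacobi, no ILU factorization
>     call KSPSetFromOptions(ksp,ierr)  ! keep -ksp_type/-pc_type overridable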
> > At line 333 of file solvelinearmgPETSc.f90
> > Fortran runtime error: Array bound mismatch for dimension 1 of array 'sol' (213120/106560)
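> The last two lines are a separate problem: 213120 is exactly 2 x 106560,
> which suggests 'sol' is dimensioned with the global size while, on 2
> processes, each rank receives only its local part of the solution vector.
> A sketch of sizing it from the Vec itself (xp, nloc, and psol are
> hypothetical names for the solution Vec, its local length, and a
> PetscScalar pointer):
>
>     call VecGetLocalSize(xp,nloc,ierr)
>     allocate(sol(nloc))                   ! local length, not nbdp
>     call VecGetArrayF90(xp,psol,ierr)     ! PetscScalar, pointer :: psol(:)
>     sol(1:nloc) = psol(1:nloc)            ! copy the locally owned values
>     call VecRestoreArrayF90(xp,psol,ierr)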
> >
> >
> > This code works with -n 1 (a single core), but it gives this error when
> > using more than one core.
> >
> > What am I missing?
> >
> > Regards,
> >
> > Manuel.
> >
> > <solvelinearmgPETSc.f90>
>
>