<div dir="ltr">Indeed there is something wrong with that call, it hangs out indefinitely showing only:<div><br></div><div><font face="monospace, monospace"> Mat Object: 1 MPI processes</font><div><font face="monospace, monospace"> type: mpiaij</font></div><div><br></div></div><div>It draws my attention that this program works for 1 processor but not more, but it doesnt show anything for that viewer in either case.</div><div><br></div><div>Thanks for the insight on the redundant calls, this is not very clear on documentation, which calls are included in others.</div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
The call to MatCreateMPIAIJWithArrays() is likely interpreting the values you pass in differently than you expect.
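For reference, a sketch of how that routine reads its arguments on each process (the names mirror the call quoted further down; m is only a stand-in for however many rows this rank owns):

      ! m    : number of rows owned by THIS process (the m's must sum to nbdp)
      ! iapi : CSR row pointers for those local rows only, length m+1, zero-based
      ! japi : global column indices of the local rows, zero-based
      ! app  : nonzero values of the local rows
      call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, m, PETSC_DECIDE, &
           nbdp, nbdp, iapi, japi, app, Ap, ierr)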
Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) after the MatCreateMPIAIJWithArrays() to see what PETSc thinks the matrix is.
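A minimal sketch of that check; note that MatView with PETSC_VIEWER_STDOUT_WORLD is collective, so every rank in PETSC_COMM_WORLD must reach the call, otherwise the ranks that do reach it will block:

      ! placed immediately after the MatCreateMPIAIJWithArrays() call
      ! collective: all ranks in PETSC_COMM_WORLD must reach this line
      call MatView(Ap, PETSC_VIEWER_STDOUT_WORLD, ierr)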
> On Sep 26, 2016, at 3:42 PM, Manuel Valera <mvalera@mail.sdsu.edu> wrote:
>
> Hello,
>
> I'm working on solving a linear system in parallel. Following ex12 of the KSP tutorials I don't see any major complication in doing so, so starting from a working linear system solver with PCJACOBI and KSPGCR I made only the following changes:
>
> call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
> ! call MatSetType(Ap,MATSEQAIJ,ierr)
> call MatSetType(Ap,MATMPIAIJ,ierr) !parallelization
>
> call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr);
>
> ! call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)
> call MatSetFromOptions(Ap,ierr)

Note that none of the lines above are needed (or do anything), because MatCreateMPIAIJWithArrays() creates the matrix from scratch itself.
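So the parallel path can shrink to something like the sketch below (ksp and m are assumed names here, not taken from your code):

      ! this single call creates, preallocates, and assembles Ap
      call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, m, PETSC_DECIDE, &
           nbdp, nbdp, iapi, japi, app, Ap, ierr)
      ! hand the assembled matrix straight to the solver
      call KSPSetOperators(ksp, Ap, Ap, ierr)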

  Barry

> ! call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr)
> call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
>
>
> I grayed out (commented with !) the changes from the sequential implementation.
>
> So, it does not complain at runtime until it reaches KSPSolve(), with the following error:
>
>
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Object is in wrong state
> [1]PETSC ERROR: Matrix is missing diagonal entry 0
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown
> [1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1 --download-ml=1
> [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c
> [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c
> [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> [1]PETSC ERROR: #4 PCSetUp() line 968 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: #5 KSPSetUp() line 390 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line 650 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
> [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: #9 KSPSolve() line 600 in /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> At line 333 of file solvelinearmgPETSc.f90
> Fortran runtime error: Array bound mismatch for dimension 1 of array 'sol' (213120/106560)
>
>
> This code works with -n 1 (one core), but it gives this error when using more than one core.
>
> What am I missing?
>
> Regards,
>
> Manuel.
>
> <solvelinearmgPETSc.f90>