<div dir="ltr"><div>I'm comparing with the ones vector as in many examples from petsc docs, so this may be because i hadn't set up the output to a single processor, but i get the following output for 1,2,4 processors:</div><div><br></div><div>n=1</div><div><div><font face="monospace, monospace">TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"><b> Norm: 7.21632103486563610E-011</b></font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Total time: 5.0112988948822021</font></div><div><font face="monospace, monospace"><br></font></div><div><font face="monospace, monospace">n=2</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"><b> Norm: 1.09862436488003634E-007</b></font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Norm: 1.09862436488003634E-007</font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Total time: 2.9765341281890869</font></div><div><font face="monospace, monospace"> Total time: 2.9770300388336182</font></div><div><font face="monospace, monospace"><br></font></div><div><font face="monospace, monospace">n=4</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> TrivSoln loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"> RHS loaded, size: 213120 / 213120</font></div><div><font face="monospace, monospace"><b> Norm: 1.72790692829407788E-005</b></font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Norm: 1.72790692829407788E-005</font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Norm: 1.72790692829407788E-005</font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Norm: 1.72790692829407788E-005</font></div><div><font face="monospace, monospace"> Its: 101</font></div><div><font face="monospace, monospace"> Total time: 1.8007240295410156</font></div><div><font face="monospace, monospace"> Total time: 1.8008360862731934</font></div><div><font face="monospace, monospace"> Total time: 1.8008909225463867</font></div><div><font face="monospace, monospace"> Total time: 1.8009200096130371</font></div><div><br></div></div><div><br></div><div>That is the error norm from the ones vector, im attaching the script again.</div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Sat, Oct 1, 2016 at 8:59 
AM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class=""><br>
> On Sep 30, 2016, at 9:13 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
><br>
> Hi Barry and all,<br>
><br>
> I was successful in creating the parallel version to solve my big system, and it is scaling accordingly, but I noticed the error norm increasing too. I don't know if this is because the output is duplicated or if it is really increasing. Is this expected?<br>
<br>
</span> What do you mean by error norm? Do you have an exact solution you are comparing to? If so, you should scale the norm arising from this by 1/sqrt(nx*ny), where nx and ny are the number of grid points in the x and y directions. This scaling makes the norm correspond to the L2 norm of the error, which is what you want to measure.<br>
<br>
With this new scaling you can do convergence studies: for example, refine the grid once and check how much the error norm is reduced; refine the grid again and you should see a similar reduction in the error norm.<br>
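<br>
For concreteness, a minimal Fortran sketch of that scaled error norm; the names x (computed solution), xexact (exact solution), nx and ny below are placeholders for whatever your script actually uses:<br>
<br>
  Vec            err<br>
  PetscScalar    neg_one<br>
  PetscReal      errnorm<br>
  PetscErrorCode ierr<br>
<br>
  neg_one = -1.0d0<br>
  call VecDuplicate(x,err,ierr)<br>
  call VecCopy(x,err,ierr)<br>
  call VecAXPY(err,neg_one,xexact,ierr)      ! err = x - xexact<br>
  call VecNorm(err,NORM_2,errnorm,ierr)<br>
  errnorm = errnorm/sqrt(dble(nx*ny))        ! approximates the L2 norm of the error<br>
  call VecDestroy(err,ierr)<br>
<br>
With a second-order discretization, halving the grid spacing should reduce this scaled norm by roughly a factor of four.<br>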
<span class="HOEnZb"><font color="#888888"><br>
<br>
Barry<br>
</font></span><div class="HOEnZb"><div class="h5"><br>
><br>
> Thanks<br>
><br>
> On Tue, Sep 27, 2016 at 4:07 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
><br>
> Yes, always use the binary file<br>
><br>
> > On Sep 27, 2016, at 3:13 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> ><br>
> > Barry, thanks for your insight,<br>
> ><br>
> > This standalone script must be translated into a much bigger model, which uses AIJ matrices to define the Laplacian in the form of the 3 usual arrays; the ASCII files in the script take the place of the arrays that are passed to the solving routine in the model.<br>
> ><br>
> > So, can I use the approach you mention to create the MPIAIJ matrix from the PETSc binary file? Would this be a better solution than reading the three arrays directly? In the model, even the smallest matrix is 10^5 x 10^5 elements.<br>
> ><br>
> > Thanks.<br>
> ><br>
> ><br>
> > On Tue, Sep 27, 2016 at 12:53 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
> ><br>
> > Are you loading a matrix from an ASCII file? If so don't do that. You should write a simple sequential PETSc program that reads in the ASCII file and saves the matrix as a PETSc binary file with MatView(). Then write your parallel code that reads in the binary file with MatLoad() and solves the system. You can read in the right hand side from ASCII and save it in the binary file also. Trying to read an ASCII file in parallel and set it into a PETSc parallel matrix is just a totally thankless task that is unnecessary.<br>
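> ><br>
> > A rough sketch of those two programs in Fortran (the file name 'system.petsc' and the names A, b, viewer here are placeholders, not anything from the actual script):<br>
> ><br>
> >   ! sequential converter, run on one process: assemble A and b from the<br>
> >   ! ASCII data exactly as now, then dump both into one PETSc binary file<br>
> >   call PetscViewerBinaryOpen(PETSC_COMM_SELF,'system.petsc',FILE_MODE_WRITE,viewer,ierr)<br>
> >   call MatView(A,viewer,ierr)<br>
> >   call VecView(b,viewer,ierr)<br>
> >   call PetscViewerDestroy(viewer,ierr)<br>
> ><br>
> >   ! parallel solver: MatLoad()/VecLoad() read the same file and distribute the rows<br>
> >   call PetscViewerBinaryOpen(PETSC_COMM_WORLD,'system.petsc',FILE_MODE_READ,viewer,ierr)<br>
> >   call MatCreate(PETSC_COMM_WORLD,A,ierr)<br>
> >   call MatSetType(A,MATMPIAIJ,ierr)<br>
> >   call MatLoad(A,viewer,ierr)<br>
> >   call VecCreate(PETSC_COMM_WORLD,b,ierr)<br>
> >   call VecLoad(b,viewer,ierr)<br>
> >   call PetscViewerDestroy(viewer,ierr)<br>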
> ><br>
> > Barry<br>
> ><br>
> > > On Sep 26, 2016, at 6:40 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> > ><br>
> > > Ok, the last output was from simulated multicores; on an actual cluster the errors are of this kind:<br>
> > ><br>
> > > [valera@cinci CSRMatrix]$ petsc -n 2 ./solvelinearmgPETSc<br>
> > > TrivSoln loaded, size: 4 / 4<br>
> > > TrivSoln loaded, size: 4 / 4<br>
> > > RHS loaded, size: 4 / 4<br>
> > > RHS loaded, size: 4 / 4<br>
> > > [0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [0]PETSC ERROR: Argument out of range<br>
> > > [0]PETSC ERROR: Comm must be of size 1<br>
> > > [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [1]PETSC ERROR: Argument out of range<br>
> > > [1]PETSC ERROR: Comm must be of size 1<br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [1]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [1]PETSC ERROR: #2 MatSetType() line 94 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matreg.c<br>
> > > [1]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > local size: 2<br>
> > > local size: 2<br>
> > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [0]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [0]PETSC ERROR: #2 MatSetType() line 94 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matreg.c<br>
> > > [0]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: Nonconforming object sizes<br>
> > > [0]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my local length 4<br>
> > > likely a call to VecSetSizes() or MatSetSizes() is wrong.<br>
> > > See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#split" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html#split</a><br>
> > > [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > Nonconforming object sizes<br>
> > > [1]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my local length 4<br>
> > > likely a call to VecSetSizes() or MatSetSizes() is wrong.<br>
> > > See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#split" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html#split</a><br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [0]PETSC ERROR: #4 PetscSplitOwnership() line 93 in /home/valera/petsc-3.7.2/src/<wbr>sys/utils/psplit.c<br>
> > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [1]PETSC ERROR: #4 PetscSplitOwnership() line 93 in /home/valera/petsc-3.7.2/src/<wbr>sys/utils/psplit.c<br>
> > > [0]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in /home/valera/petsc-3.7.2/src/<wbr>vec/is/utils/pmap.c<br>
> > > [0]PETSC ERROR: #6 MatMPIAIJSetPreallocation_<wbr>MPIAIJ() line 2768 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in /home/valera/petsc-3.7.2/src/<wbr>vec/is/utils/pmap.c<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: #7 MatMPIAIJSetPreallocation() line 3505 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > #6 MatMPIAIJSetPreallocation_<wbr>MPIAIJ() line 2768 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > #7 MatMPIAIJSetPreallocation() line 3505 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [0]PETSC ERROR: #9 MatSetUp() line 739 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [1]PETSC ERROR: #9 MatSetUp() line 739 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [0]PETSC ERROR: Object is in wrong state<br>
> > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 "mat" before MatSetNearNullSpace()<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > Object is in wrong state<br>
> > > [1]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 "mat" before MatSetNearNullSpace()<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [0]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [1]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [0]PETSC ERROR: Object is in wrong state<br>
> > > [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 "mat" before MatAssemblyBegin()<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: Object is in wrong state<br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 "mat" before MatAssemblyBegin()<br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: ------------------------------<wbr>------------------------------<wbr>------------<br>
> > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
> > > [1]PETSC ERROR: ------------------------------<wbr>------------------------------<wbr>------------<br>
> > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> > > [0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html#<wbr>valgrind</a><br>
> > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> > > [1]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html#<wbr>valgrind</a><br>
> > > [1]PETSC ERROR: or try <a href="http://valgrind.org" rel="noreferrer" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
> > > or try <a href="http://valgrind.org" rel="noreferrer" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
> > > [0]PETSC ERROR: likely location of problem given in stack below<br>
> > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------<wbr>------<br>
> > > [1]PETSC ERROR: likely location of problem given in stack below<br>
> > > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------<wbr>------<br>
> > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
> > > [0]PETSC ERROR: INSTEAD the line number of the start of the function<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
> > > [1]PETSC ERROR: INSTEAD the line number of the start of the function<br>
> > > is given.<br>
> > > [0]PETSC ERROR: [0] MatAssemblyEnd line 5185 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: is given.<br>
> > > [1]PETSC ERROR: [1] MatAssemblyEnd line 5185 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0] MatAssemblyBegin line 5090 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: [0] MatSetNearNullSpace line 8191 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: [1] MatAssemblyBegin line 5090 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [1]PETSC ERROR: [0] PetscSplitOwnership line 80 /home/valera/petsc-3.7.2/src/<wbr>sys/utils/psplit.c<br>
> > > [0]PETSC ERROR: [0] PetscLayoutSetUp line 129 /home/valera/petsc-3.7.2/src/<wbr>vec/is/utils/pmap.c<br>
> > > [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation_<wbr>MPIAIJ line 2767 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1] MatSetNearNullSpace line 8191 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [1]PETSC ERROR: [1] PetscSplitOwnership line 80 /home/valera/petsc-3.7.2/src/<wbr>sys/utils/psplit.c<br>
> > > [1]PETSC ERROR: [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation line 3502 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [0]PETSC ERROR: [0] MatSetUp_MPIAIJ line 2152 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1] PetscLayoutSetUp line 129 /home/valera/petsc-3.7.2/src/<wbr>vec/is/utils/pmap.c<br>
> > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation_<wbr>MPIAIJ line 2767 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [0]PETSC ERROR: [0] MatSetUp line 727 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [0]PETSC ERROR: [0] MatCreate_SeqAIJ line 3956 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation line 3502 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [1]PETSC ERROR: [1] MatSetUp_MPIAIJ line 2152 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/mpi/mpiaij.c<br>
> > > [0]PETSC ERROR: [0] MatSetType line 44 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matreg.c<br>
> > > [0]PETSC ERROR: [0] MatCreateSeqAIJWithArrays line 4295 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [1]PETSC ERROR: [1] MatSetUp line 727 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matrix.c<br>
> > > [1]PETSC ERROR: [1] MatCreate_SeqAIJ line 3956 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [0]PETSC ERROR: Signal received<br>
> > > [1]PETSC ERROR: [1] MatSetType line 44 /home/valera/petsc-3.7.2/src/<wbr>mat/interface/matreg.c<br>
> > > [1]PETSC ERROR: [1] MatCreateSeqAIJWithArrays line 4295 /home/valera/petsc-3.7.2/src/<wbr>mat/impls/aij/seq/aij.c<br>
> > > [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [0]PETSC ERROR: Signal received<br>
> > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > [1]PETSC ERROR: #12 User provided function() line 0 in unknown file<br>
> > > Petsc Release Version 3.7.2, Jun, 05, 2016<br>
> > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016<br>
> > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich<br>
> > > [1]PETSC ERROR: #12 User provided function() line 0 in unknown file<br>
> > > application called MPI_Abort(comm=0x84000004, 59) - process 0<br>
> > > [cli_0]: aborting job:<br>
> > > application called MPI_Abort(comm=0x84000004, 59) - process 0<br>
> > > application called MPI_Abort(comm=0x84000002, 59) - process 1<br>
> > > [cli_1]: aborting job:<br>
> > > application called MPI_Abort(comm=0x84000002, 59) - process 1<br>
> > ><br>
> > > ==============================<wbr>==============================<wbr>=======================<br>
> > > = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>
> > > = PID 10266 RUNNING AT cinci<br>
> > > = EXIT CODE: 59<br>
> > > = CLEANING UP REMAINING PROCESSES<br>
> > > = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>
> > > ==============================<wbr>==============================<wbr>=======================<br>
> > ><br>
> > ><br>
> > > On Mon, Sep 26, 2016 at 3:51 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> > > Ok, I created a tiny testcase just for this.<br>
> > ><br>
> > > The output from the n# calls is as follows:<br>
> > ><br>
> > > n1:<br>
> > > Mat Object: 1 MPI processes<br>
> > > type: mpiaij<br>
> > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 1: (0, 2.) (1, 1.) (2, 3.) (3, 4.)<br>
> > > row 2: (0, 4.) (1, 3.) (2, 1.) (3, 2.)<br>
> > > row 3: (0, 3.) (1, 4.) (2, 2.) (3, 1.)<br>
> > ><br>
> > > n2:<br>
> > > Mat Object: 2 MPI processes<br>
> > > type: mpiaij<br>
> > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 1: (0, 2.) (1, 1.) (2, 3.) (3, 4.)<br>
> > > row 2: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 3: (0, 2.) (1, 1.) (2, 3.) (3, 4.)<br>
> > ><br>
> > > n4:<br>
> > > Mat Object: 4 MPI processes<br>
> > > type: mpiaij<br>
> > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 1: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 2: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > > row 3: (0, 1.) (1, 2.) (2, 4.) (3, 3.)<br>
> > ><br>
> > ><br>
> > ><br>
> > > It really gets messed up; I have no idea what's happening.<br>
> > ><br>
> > ><br>
> > ><br>
> > ><br>
> > > On Mon, Sep 26, 2016 at 3:12 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
> > ><br>
> > > > On Sep 26, 2016, at 5:07 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> > > ><br>
> > > > Ok, I was using a big matrix before; from a smaller testcase I got the output, and effectively it looks like it is not read in correctly at all. Results are attached for the DRAW viewer; the output is too big to use STDOUT even in the small testcase. n# is the number of processors requested.<br>
> > ><br>
> > > You need to construct a very small test case so you can determine why the values do not end up where you expect them. There is no way around it.<br>
> > > ><br>
> > > > Is there a way to create the matrix on one node and then distribute it as needed to the rest? Maybe that would work.<br>
> > ><br>
> > > No, that is not scalable. You become limited by the memory of the one node.<br>
> > ><br>
> > > ><br>
> > > > Thanks<br>
> > > ><br>
> > > > On Mon, Sep 26, 2016 at 2:40 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
> > > ><br>
> > > > How large is the matrix? It will take a very long time if the matrix is large. Debug with a very small matrix.<br>
> > > ><br>
> > > > Barry<br>
> > > ><br>
> > > > > On Sep 26, 2016, at 4:34 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> > > > ><br>
> > > > > Indeed there is something wrong with that call; it hangs indefinitely, showing only:<br>
> > > > ><br>
> > > > > Mat Object: 1 MPI processes<br>
> > > > > type: mpiaij<br>
> > > > ><br>
> > > > > It draws my attention that this program works for 1 processor but not more, yet it doesn't show anything for that viewer in either case.<br>
> > > > ><br>
> > > > > Thanks for the insight on the redundant calls; it is not very clear in the documentation which calls are included in others.<br>
> > > > ><br>
> > > > ><br>
> > > > ><br>
> > > > > On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>> wrote:<br>
> > > > ><br>
> > > > > The call to MatCreateMPIAIJWithArrays() is likely interpreting the values you pass in differently than you expect.<br>
> > > > ><br>
> > > > > Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) after the MatCreateMPIAIJWithArrays() to see what PETSc thinks the matrix is.<br>
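> > > > ><br>
> > > > > Also note that MatCreateMPIAIJWithArrays() expects each process to pass only the CSR rows it owns (the row-pointer array restarts at 0 on every process and the column indices stay global), not the full matrix on every rank. A rough sketch, where m_local, ia_loc, ja_loc and a_loc are placeholder names for the per-process slice of your arrays:<br>
> > > > ><br>
> > > > >   call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,m_local,PETSC_DECIDE,nbdp,nbdp,ia_loc,ja_loc,a_loc,Ap,ierr)<br>
> > > > >   call MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr)   ! check what PETSc actually built<br>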
> > > > ><br>
> > > > ><br>
> > > > > > On Sep 26, 2016, at 3:42 PM, Manuel Valera <<a href="mailto:mvalera@mail.sdsu.edu">mvalera@mail.sdsu.edu</a>> wrote:<br>
> > > > > ><br>
> > > > > > Hello,<br>
> > > > > ><br>
> > > > > > I'm working on solving a linear system in parallel. Following ex12 of the KSP tutorials I don't see any major complication in doing so, so for a working linear system solver with PCJACOBI and KSPGCR I made only the following changes:<br>
> > > > > ><br>
> > > > > > call MatCreate(PETSC_COMM_WORLD,Ap,ierr)<br>
> > > > > > ! call MatSetType(Ap,MATSEQAIJ,ierr)<br>
> > > > > > call MatSetType(Ap,MATMPIAIJ,ierr) !parallelization<br>
> > > > > ><br>
> > > > > > call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr);<br>
> > > > > ><br>
> > > > > > ! call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)<br>
> > > > > > call MatSetFromOptions(Ap,ierr)<br>
> > > > ><br>
> > > > > Note that none of the lines above are needed (or do anything) because MatCreateMPIAIJWithArrays() creates the matrix from scratch itself.<br>
> > > > ><br>
> > > > > Barry<br>
> > > > ><br>
> > > > > > ! call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr)<br>
> > > > > > call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)<br>
> > > > > ><br>
> > > > > ><br>
> > > > > > I grayed out the changes from the sequential implementation.<br>
> > > > > ><br>
> > > > > > So, it does not complain at runtime until it reaches KSPSolve(), with the following error:<br>
> > > > > ><br>
> > > > > ><br>
> > > > > > [1]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--<br>
> > > > > > [1]PETSC ERROR: Object is in wrong state<br>
> > > > > > [1]PETSC ERROR: Matrix is missing diagonal entry 0<br>
> > > > > > [1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/<wbr>documentation/faq.html</a> for trouble shooting.<br>
> > > > > > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown<br>
> > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc � � on a arch-linux2-c-debug named valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016<br>
> > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1 --download-ml=1<br>
> > > > > > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/mat/impls/aij/seq/<wbr>aijfact.c<br>
> > > > > > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/mat/interface/<wbr>matrix.c<br>
> > > > > > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/pc/impls/factor/<wbr>ilu/ilu.c<br>
> > > > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/pc/interface/<wbr>precon.c<br>
> > > > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/ksp/interface/<wbr>itfunc.c<br>
> > > > > > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_<wbr>Singleblock() line 650 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/pc/impls/<wbr>bjacobi/bjacobi.c<br>
> > > > > > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/pc/interface/<wbr>precon.c<br>
> > > > > > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/ksp/interface/<wbr>itfunc.c<br>
> > > > > > [1]PETSC ERROR: #9 KSPSolve() line 600 in /home/valera/v5PETSc/petsc/<wbr>petsc/src/ksp/ksp/interface/<wbr>itfunc.c<br>
> > > > > > At line 333 of file solvelinearmgPETSc.f90<br>
> > > > > > Fortran runtime error: Array bound mismatch for dimension 1 of array 'sol' (213120/106560)<br>
> > > > > ><br>
> > > > > ><br>
> > > > > > This code works with -n 1 (one core), but it gives this error when using more than one core.<br>
> > > > > ><br>
> > > > > > What am I missing?<br>
> > > > > ><br>
> > > > > > Regards,<br>
> > > > > ><br>
> > > > > > Manuel.<br>
> > > > > ><br>
> > > > > > <solvelinearmgPETSc.f90><br>
> > > > ><br>
> > > > ><br>
> > > ><br>
> > > ><br>
> > > > <n4.png><n2.png><n1.png><br>
> > ><br>
> > ><br>
> > ><br>
> > > <rhss.txt><solvelinearmgPETSc.<wbr>f90><as.txt><ias.txt><jas.txt><br>
> ><br>
> ><br>
><br>
><br>
<br>
</div></div></blockquote></div><br></div>