[petsc-users] Solve KSP in parallel.

Manuel Valera mvalera at mail.sdsu.edu
Mon Sep 26 18:40:21 CDT 2016


Ok, the last output was from simulated multicores; on an actual cluster the
errors are of this kind:

[valera at cinci CSRMatrix]$ petsc -n 2 ./solvelinearmgPETSc
 TrivSoln loaded, size:            4 /           4
 TrivSoln loaded, size:            4 /           4
 RHS loaded, size:            4 /           4
 RHS loaded, size:            4 /           4
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Comm must be of size 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[0]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Comm must be of size 1
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[1]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[1]PETSC ERROR: #2 MatSetType() line 94 in
/home/valera/petsc-3.7.2/src/mat/interface/matreg.c
[1]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
 local size:           2
 local size:           2
Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
--download-fblaslapack=1 --download-mpich
[0]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #2 MatSetType() line 94 in
/home/valera/petsc-3.7.2/src/mat/interface/matreg.c
[0]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: [0]PETSC ERROR: Nonconforming object sizes
[0]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my
local length 4
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
Nonconforming object sizes
[1]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my
local length 4
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[0]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[0]PETSC ERROR: #4 PetscSplitOwnership() line 93 in
/home/valera/petsc-3.7.2/src/sys/utils/psplit.c
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[1]PETSC ERROR: #4 PetscSplitOwnership() line 93 in
/home/valera/petsc-3.7.2/src/sys/utils/psplit.c
[0]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in
/home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
[0]PETSC ERROR: #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in
/home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
[1]PETSC ERROR: [0]PETSC ERROR: #7 MatMPIAIJSetPreallocation() line 3505 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
#6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: [0]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
#7 MatMPIAIJSetPreallocation() line 3505 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #9 MatSetUp() line 739 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[1]PETSC ERROR: #9 MatSetUp() line 739 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
argument 1 "mat" before MatSetNearNullSpace()
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[0]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
Object is in wrong state
[1]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
argument 1 "mat" before MatSetNearNullSpace()
[1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cc=gcc
--with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich
[0]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
shooting.
[1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[1]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
argument 1 "mat" before MatAssemblyBegin()
[0]PETSC ERROR: [1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
argument 1 "mat" before MatAssemblyBegin()
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
--download-fblaslapack=1 --download-mpich
[1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[1]PETSC ERROR: [0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors
or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR: [1]PETSC ERROR: Note: The EXACT line numbers in the stack
are not available,
[1]PETSC ERROR:       INSTEAD the line number of the start of the function
      is given.
[0]PETSC ERROR: [0] MatAssemblyEnd line 5185
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: [1]PETSC ERROR:       is given.
[1]PETSC ERROR: [1] MatAssemblyEnd line 5185
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0] MatAssemblyBegin line 5090
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] MatSetNearNullSpace line 8191
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: [1]PETSC ERROR: [1] MatAssemblyBegin line 5090
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[1]PETSC ERROR: [0] PetscSplitOwnership line 80
/home/valera/petsc-3.7.2/src/sys/utils/psplit.c
[0]PETSC ERROR: [0] PetscLayoutSetUp line 129
/home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
[0]PETSC ERROR: [0] MatMPIAIJSetPreallocation_MPIAIJ line 2767
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1] MatSetNearNullSpace line 8191
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[1]PETSC ERROR: [1] PetscSplitOwnership line 80
/home/valera/petsc-3.7.2/src/sys/utils/psplit.c
[1]PETSC ERROR: [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation line 3502
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: [0] MatSetUp_MPIAIJ line 2152
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1] PetscLayoutSetUp line 129
/home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
[1]PETSC ERROR: [1] MatMPIAIJSetPreallocation_MPIAIJ line 2767
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: [0] MatSetUp line 727
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] MatCreate_SeqAIJ line 3956
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[1]PETSC ERROR: [1] MatMPIAIJSetPreallocation line 3502
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: [1] MatSetUp_MPIAIJ line 2152
/home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: [0] MatSetType line 44
/home/valera/petsc-3.7.2/src/mat/interface/matreg.c
[0]PETSC ERROR: [0] MatCreateSeqAIJWithArrays line 4295
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[1]PETSC ERROR: [1] MatSetUp line 727
/home/valera/petsc-3.7.2/src/mat/interface/matrix.c
[1]PETSC ERROR: [1] MatCreate_SeqAIJ line 3956
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Signal received
[1]PETSC ERROR: [1] MatSetType line 44
/home/valera/petsc-3.7.2/src/mat/interface/matreg.c
[1]PETSC ERROR: [1] MatCreateSeqAIJWithArrays line 4295
/home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[0]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-fblaslapack=1 --download-mpich
[1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
application called MPI_Abort(comm=0x84000004, 59) - process 0
[cli_0]: aborting job:
application called MPI_Abort(comm=0x84000004, 59) - process 0
application called MPI_Abort(comm=0x84000002, 59) - process 1
[cli_1]: aborting job:
application called MPI_Abort(comm=0x84000002, 59) - process 1

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 10266 RUNNING AT cinci
=   EXIT CODE: 59
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
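
Looking at the stack above, the run still seems to go through
MatCreateSeqAIJWithArrays() on PETSC_COMM_WORLD (hence the "Comm must be of
size 1" message), and later each rank appears to declare the whole matrix as
its local part (hence "Sum of local lengths 8 does not equal global length 4").
For reference, this is how I understand the two variants are meant to be
called; it is only a sketch based on my reading of the man pages, and nlocal
and the *_loc names are placeholders, not variables from my code:

  ! Sequential variant: the communicator must contain exactly one process.
  call MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,nbdp,nbdp,iapi,japi,app,Ap,ierr)

  ! Parallel variant: every rank passes ONLY its own nlocal rows.
  ! iapi_loc has nlocal+1 entries and starts at 0 on every rank,
  ! japi_loc holds global column indices, and the nlocal values must
  ! add up to nbdp across all ranks.
  call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,nlocal,PETSC_DECIDE, &
       nbdp,nbdp,iapi_loc,japi_loc,app_loc,Ap,ierr)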


On Mon, Sep 26, 2016 at 3:51 PM, Manuel Valera <mvalera at mail.sdsu.edu>
wrote:

> Ok, i created a tiny testcase just for this,
>
> The output from n# calls are as follows:
>
> n1:
> Mat Object: 1 MPI processes
>   type: mpiaij
> row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> row 2: (0, 4.)  (1, 3.)  (2, 1.)  (3, 2.)
> row 3: (0, 3.)  (1, 4.)  (2, 2.)  (3, 1.)
>
> n2:
> Mat Object: 2 MPI processes
>   type: mpiaij
> row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 3: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
>
> n4:
> Mat Object: 4 MPI processes
>   type: mpiaij
> row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 1: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> row 3: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
>
>
>
> It really gets messed up; I have no idea what's happening.
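>
> The only pattern I can make out: if I read the MatCreateMPIAIJWithArrays()
> man page right, with 2 ranks each rank should be handing in only its own 2
> rows of the 4x4 matrix, with the row-pointer array restarting at 0 on every
> rank and the column indices staying global. A sketch of what I think that
> means for this test case (ia_local/ja_local/a_local are placeholder names):
>
>   ! rank 0 owns rows 0-1: ia_local = (/0,4,8/), ja_local/a_local = rows 0,1
>   ! rank 1 owns rows 2-3: ia_local = (/0,4,8/), ja_local/a_local = rows 2,3
>   call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,2,PETSC_DECIDE,4,4, &
>        ia_local,ja_local,a_local,Ap,ierr)
>
> If instead both ranks pass the same full global arrays, that might explain
> why rows 2-3 come out as copies of rows 0-1 with n2, and every row as a
> copy of row 0 with n4.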
>
>
>
>
> On Mon, Sep 26, 2016 at 3:12 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>> > On Sep 26, 2016, at 5:07 PM, Manuel Valera <mvalera at mail.sdsu.edu>
>> wrote:
>> >
>> > Ok, I was using a big matrix before; from a smaller test case I got the
>> output and, indeed, it looks like it is not being read correctly at all.
>> Results are attached from the DRAW viewer; the output is too big for STDOUT
>> even in the small test case. n# is the number of processors requested.
>>
>>    You need to construct a very small test case so you can determine why
>> the values do not end up where you expect them. There is no way around it.
>> >
>> > is there a way to create the matrix on one node and then distribute it
>> as needed to the rest? Maybe that would work.
>>
>>    No, that is not scalable. You become limited by the memory of the one
>> node.
>>
>> >
>> > Thanks
>> >
>> > On Mon, Sep 26, 2016 at 2:40 PM, Barry Smith <bsmith at mcs.anl.gov>
>> wrote:
>> >
>> >     How large is the matrix? It will take a very long time if the
>> matrix is large. Debug with a very small matrix.
>> >
>> >   Barry
>> >
>> > > On Sep 26, 2016, at 4:34 PM, Manuel Valera <mvalera at mail.sdsu.edu>
>> wrote:
>> > >
>> > > Indeed there is something wrong with that call; it hangs indefinitely,
>> showing only:
>> > >
>> > >  Mat Object: 1 MPI processes
>> > >   type: mpiaij
>> > >
>> > > It strikes me that this program works for 1 processor but not for more,
>> yet it doesn't show anything for that viewer in either case.
>> > >
>> > > Thanks for the insight on the redundant calls; the documentation is not
>> very clear about which calls are included in others.
>> > >
>> > >
>> > >
>> > > On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <bsmith at mcs.anl.gov>
>> wrote:
>> > >
>> > >    The call to MatCreateMPIAIJWithArrays() is likely interpreting the
>> values you pass in differently than you expect.
>> > >
>> > >     Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) after
>> the MatCreateMPIAIJWithArrays() to see what PETSc thinks the matrix is.
>> > >
>> > >
>> > > > On Sep 26, 2016, at 3:42 PM, Manuel Valera <mvalera at mail.sdsu.edu>
>> wrote:
>> > > >
>> > > > Hello,
>> > > >
>> > > > I'm working on solving a linear system in parallel. Following ex12 of
>> the KSP tutorials, I don't see any major complication in doing so, so for a
>> working linear system solver with PCJACOBI and KSPGCR I made only the
>> following changes:
>> > > >
>> > > >    call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
>> > > > !  call MatSetType(Ap,MATSEQAIJ,ierr)
>> > > >   call MatSetType(Ap,MATMPIAIJ,ierr) ! parallelization
>> > > >
>> > > >   call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr);
>> > > >
>> > > > !  call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)
>> > > >   call MatSetFromOptions(Ap,ierr)
>> > >
>> > >     Note that none of the lines above are needed (or do anything)
>> because the MatCreateMPIAIJWithArrays() creates the matrix from scratch
>> itself.
>> > >
>> > >    Barry
>> > >
>> > > > !  call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr)
>> > > >  call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
>> > > >
>> > > >
>> > > > I commented out (with '!') the old sequential lines and replaced them
>> with the parallel versions shown above.
>> > > >
>> > > > So, it does not complain at runtime until it reaches KSPSolve(),
>> with the following error:
>> > > >
>> > > >
>> > > > [1]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> > > > [1]PETSC ERROR: Object is in wrong state
>> > > > [1]PETSC ERROR: Matrix is missing diagonal entry 0
>> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> > > > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown
>> > > > [1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named
>> valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016
>> > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>> --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1
>> --download-ml=1
>> > > > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in
>> /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c
>> > > > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in
>> /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c
>> > > > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
>> > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
>> > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
>> > > > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line 650
>> in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
>> > > > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
>> > > > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
>> > > > [1]PETSC ERROR: #9 KSPSolve() line 600 in
>> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
>> > > > At line 333 of file solvelinearmgPETSc.f90
>> > > > Fortran runtime error: Array bound mismatch for dimension 1 of
>> array 'sol' (213120/106560)
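>> > > > (I notice that 213120 is exactly 2 x 106560, so my guess is that
>> 'sol' is being dimensioned with the global size on one side and the
>> per-rank size on the other.)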
>> > > >
>> > > >
>> > > > This code works with -n 1 (a single core), but it gives this error
>> when using more than one core.
>> > > >
>> > > > What am i missing?
>> > > >
>> > > > Regards,
>> > > >
>> > > > Manuel.
>> > > >
>> > > > <solvelinearmgPETSc.f90>
>> > >
>> > >
>> >
>> >
>> > <n4.png><n2.png><n1.png>
>>
>>
>
-------------- next part --------------
1.000000
2.000000
4.000000
3.000000
2.000000
1.000000
3.000000
4.000000
4.000000
3.000000
1.000000
2.000000
3.000000
4.000000
2.000000
1.000000
-------------- next part --------------
A non-text attachment was scrubbed...
Name: solvelinearmgPETSc.f90
Type: text/x-fortran
Size: 15072 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160926/b48799a8/attachment-0001.bin>
-------------- next part --------------
1.000000
2.000000
4.000000
3.000000
1.000000
2.000000
4.000000
3.000000
1.000000
2.000000
4.000000
3.000000
1.000000
2.000000
4.000000
3.000000
-------------- next part --------------
0
4
8
12
16
-------------- next part --------------
0
1
2
3
1
0
3
2
2
3
0
1
3
2
1
0

