[petsc-users] Errors encountered with KSPsolver with MATNEST and VECNEST

Matthew Knepley knepley at gmail.com
Sat Jul 18 06:51:50 CDT 2020


On Sat, Jul 18, 2020 at 6:11 AM Alex Fleeter <luis.saturday at gmail.com>
wrote:

> I guess using MatNest is still the most memory-efficient option, am I
> correct? Does MatGetLocalSubMatrix demand extra memory for generating the
> submatrices?
>

No. MatGetLocalSubMatrix() just returns a view into the backing matrix,
whether it is AIJ or Nest, and it forwards the MatSetValues() calls.
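
A minimal sketch (the names A, isu, row, col, and v are all hypothetical);
the same calls work whether A is AIJ or Nest:

  Mat Auu;
  MatGetLocalSubMatrix(A, isu, isu, &Auu);                  /* a view, no copy */
  MatSetValuesLocal(Auu, 1, &row, 1, &col, &v, ADD_VALUES); /* forwarded to A */
  MatRestoreLocalSubMatrix(A, isu, isu, &Auu);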

   Matt


> On Thu, Jul 16, 2020 at 4:44 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Thu, Jul 16, 2020 at 6:54 AM Yang Juntao <Y.Juntao at hotmail.com> wrote:
>>
>>> Hi, Matt,
>>>
>>>
>>>
>>> In case I want to put blocks of matrices (assembled independently)
>>> together as the system matrix, just as in the example, what would be your
>>> recommended approach?
>>>
>>> I first looked into MatCreateBlockMat, but it only supports sequential
>>> matrices, and then I was directed to MATNEST. Do I have to assemble
>>> directly into the “system matrix”?
>>>
>>
>> The idea is that you use
>>
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetLocalSubMatrix.html
>>
>> This does require the extra step of calling
>>
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetLocalToGlobalMapping.html#MatSetLocalToGlobalMapping
>>
>> but you must already have this in order to get the global indices.
>>
>> If you use that, then you can assemble locally into AIJ or Nest with the
>> same code.
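>>
>> For instance, a minimal sketch of that workflow (the names A, isu, isp,
>> nlocal, globalIndices, and the value arrays are hypothetical):
>>
>>   ISLocalToGlobalMapping l2g;
>>   ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nlocal, globalIndices,
>>                                PETSC_COPY_VALUES, &l2g);
>>   MatSetLocalToGlobalMapping(A, l2g, l2g);   /* needed once, up front */
>>
>>   Mat Aup;                                   /* the (u,p) block, just a view */
>>   MatGetLocalSubMatrix(A, isu, isp, &Aup);
>>   MatSetValuesLocal(Aup, ni, idxi, nj, idxj, vals, ADD_VALUES);
>>   MatRestoreLocalSubMatrix(A, isu, isp, &Aup);
>>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);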
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Matrices of this type are nominally-sparse matrices in which each
>>> "entry" is a Mat
>>> <https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/Mat.html#Mat>
>>> object. Each Mat
>>> <https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/Mat.html#Mat>
>>> must have the same size and be sequential. The local and global sizes must
>>> be compatible with this decomposition. For matrices containing parallel
>>> submatrices and variable block sizes, see MATNEST
>>> <https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATNEST.html#MATNEST>.
>>>
>>>
>>>
>>>
>>> Regards
>>>
>>> Juntao
>>>
>>>
>>>
>>> From: Matthew Knepley <knepley at gmail.com>
>>> Sent: Thursday, July 16, 2020 5:56 PM
>>> To: Karl Yang <y.juntao at hotmail.com>
>>> Cc: Junchao Zhang <junchao.zhang at gmail.com>; petsc-users at mcs.anl.gov
>>> Subject: Re: [petsc-users] Errors encountered with KSPsolver with
>>> MATNEST and VECNEST
>>>
>>>
>>>
>>> On Thu, Jul 16, 2020 at 5:43 AM Karl Yang <y.juntao at hotmail.com> wrote:
>>>
>>> Hi, Matt,
>>>
>>>
>>>
>>> Sorry for the confusion. The error below appears without SetIS().
>>>
>>>
>>>
>>> I see the problem now. It is about the interface for MatNest.
>>>
>>> MatNest is intended _only_ as an optimization. I think you should never
>>> have MatNest in user code. Here the problem is that MatNest only supports
>>> extraction using the ISes it was created with, whereas detection finds a
>>> different IS. I recommend getting everything working with AIJ, and then
>>> using MatSetType() or -mat_type at the end to try out Nest. The interfaces
>>> are the same.
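>>>
>>> For instance (a sketch; the sizes match the example below, the rest is
>>> hypothetical):
>>>
>>>   Mat A;
>>>   MatCreate(PETSC_COMM_WORLD, &A);
>>>   MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 15, 15);
>>>   MatSetType(A, MATAIJ);  /* later, MATNEST here with the same assembly code */
>>>   MatSetFromOptions(A);   /* or override on the command line: -mat_type nest */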
>>>
>>>
>>>
>>>   Thanks,
>>>
>>>
>>>
>>>     Matt
>>>
>>>
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Arguments are incompatible
>>>
>>> [0]PETSC ERROR: Could not find index set
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named ae3f0ab5a8c4
>>> by Unknown Thu Jul 16 17:36:52 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #1 MatNestFindIS() line 365 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #2 MatNestFindSubMat() line 425 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #3 MatCreateSubMatrix_Nest() line 456 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #4 MatCreateSubMatrix() line 7859 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/interface/matrix.c
>>>
>>> [0]PETSC ERROR: #5 PCSetUp_FieldSplit() line 676 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>
>>> [0]PETSC ERROR: #6 PCSetUp() line 894 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Corrupt argument:
>>> https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>
>>> [0]PETSC ERROR: Invalid Pointer to Object: Parameter # 1
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named ae3f0ab5a8c4
>>> by Unknown Thu Jul 16 17:36:52 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #7 MatDestroy() line 1266 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/interface/matrix.c
>>>
>>> [0]PETSC ERROR: #8 PCSetUp_FieldSplit() line 697 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>
>>> [0]PETSC ERROR: #9 PCSetUp() line 894 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
>>>
>>> [0]PETSC ERROR: #10 KSPSetUp() line 376 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>> KSP Object: 1 MPI processes
>>>
>>>   type: fgmres
>>>
>>>     restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>
>>>     happy breakdown tolerance 1e-30
>>>
>>>   maximum iterations=10000, initial guess is zero
>>>
>>>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>>>
>>>   right preconditioning
>>>
>>>   using UNPRECONDITIONED norm type for convergence test
>>>
>>> PC Object: 1 MPI processes
>>>
>>>   type: fieldsplit
>>>
>>>   PC has not been set up so information may be incomplete
>>>
>>>     FieldSplit with Schur preconditioner, blocksize = 1, factorization
>>> FULL
>>>
>>>     Preconditioner for the Schur complement formed from S itself
>>>
>>>     Split info:
>>>
>>>     Split number 0 Defined by IS
>>>
>>>     Split number 1 Defined by IS
>>>
>>>     KSP solver for A00 block
>>>
>>>       KSP Object: (fieldsplit_0_) 1 MPI processes
>>>
>>>         type: preonly
>>>
>>>         maximum iterations=10000, initial guess is zero
>>>
>>>         tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>>>
>>>         left preconditioning
>>>
>>>         using DEFAULT norm type for convergence test
>>>
>>>       PC Object: (fieldsplit_0_) 1 MPI processes
>>>
>>>         type not yet set
>>>
>>>         PC has not been set up so information may be incomplete
>>>
>>>     KSP solver for upper A00 in upper triangular factor
>>>
>>>         not yet available
>>>
>>>     KSP solver for S = A11 - A10 inv(A00) A01
>>>
>>>         not yet available
>>>
>>>   linear system matrix = precond matrix:
>>>
>>>   Mat Object: 1 MPI processes
>>>
>>>     type: nest
>>>
>>>     rows=15, cols=15
>>>
>>>       Matrix object:
>>>
>>>         type=nest, rows=3, cols=3
>>>
>>>         MatNest structure:
>>>
>>>         (0,0) : type=seqaij, rows=5, cols=5
>>>
>>>         (0,1) : NULL
>>>
>>>         (0,2) : type=seqaij, rows=5, cols=5
>>>
>>>         (1,0) : NULL
>>>
>>>         (1,1) : type=seqaij, rows=5, cols=5
>>>
>>>         (1,2) : type=seqaij, rows=5, cols=5
>>>
>>>         (2,0) : type=seqaij, rows=5, cols=5
>>>
>>>         (2,1) : type=seqaij, rows=5, cols=5
>>>
>>>         (2,2) : NULL
>>>
>>> Segmentation fault (core dumped)
>>>
>>>
>>>
>>> Attached is the test code.
>>>
>>>
>>>
>>> Regards
>>>
>>> JT
>>>
>>>
>>>
>>>
>>> On Jul 16 2020, at 5:30 pm, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>> On Thu, Jul 16, 2020 at 1:23 AM Karl Yang <y.juntao at hotmail.com> wrote:
>>>
>>>
>>> Hi, Junchao,
>>>
>>>
>>>
>>> Thank you. It works. :)
>>>
>>>
>>>
>>> May I ask a follow-up question, regarding the same example, about the
>>> fieldsplit preconditioner? I am trying to learn about MatNest with
>>> fieldsplit on manual page 94, but I get a bit lost trying to understand
>>> fields and ISs once the setup is not exactly the same as in the manual.
>>>
>>>
>>>
>>> In this same example, I tried to use PCFIELDSPLIT and
>>> PCFieldSplitSetDetectSaddlePoint as follows.
>>>
>>>
>>>
>>> You cannot mix DetectSaddlePoint() with SetIS() since the detection
>>> tries to make the proper ISes. Maybe we should check that.
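>>>
>>> A sketch of the two setups that do work, using pc and rows[] from your
>>> code below (choose one, not both):
>>>
>>>   PCSetType(pc, PCFIELDSPLIT);
>>>   /* Option 1: define the splits yourself */
>>>   PCFieldSplitSetIS(pc, "u", rows[0]);
>>>   PCFieldSplitSetIS(pc, "v", rows[1]);
>>>   PCFieldSplitSetIS(pc, "p", rows[2]);
>>>   /* Option 2: skip the SetIS() calls and let PETSc find the splits */
>>>   /* PCFieldSplitSetDetectSaddlePoint(pc, PETSC_TRUE); */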
>>>
>>>
>>>
>>>   Thanks,
>>>
>>>
>>>
>>>      Matt
>>>
>>>
>>>
>>> IS rows[3];
>>>
>>> IS cols[3];
>>>
>>> MatNestGetISs(testMesh->system, rows, cols);
>>>
>>>
>>>
>>> KSP ksp;
>>>
>>> PC  pc;
>>>
>>> KSPCreate(PETSC_COMM_WORLD, &ksp);
>>>
>>> KSPSetType(ksp, KSPFGMRES);
>>>
>>> KSPSetOperators(ksp, testMesh->system, testMesh->system);
>>>
>>> KSPGetPC(ksp, &pc);
>>>
>>> PCFieldSplitSetIS(pc, "u", rows[0]);
>>>
>>> PCFieldSplitSetIS(pc, "v", rows[1]);
>>>
>>> PCFieldSplitSetIS(pc, "p", rows[2]);
>>>
>>> PCSetType(pc, PCFIELDSPLIT);
>>>
>>> PCFieldSplitSetDetectSaddlePoint(pc, PETSC_TRUE);
>>>
>>> KSPSetUp(ksp);
>>>
>>> KSPSolve(ksp, testMesh->rhs, testMesh->solution);
>>>
>>>
>>>
>>> But I will get the following error,
>>>
>>>
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Arguments are incompatible
>>>
>>> [0]PETSC ERROR: Could not find index set
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for
>>> trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804
>>> by Unknown Thu Jul 16 13:08:39 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #1 MatNestFindIS() line 365 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #2 MatNestFindSubMat() line 425 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #3 MatCreateSubMatrix_Nest() line 456 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/impls/nest/matnest.c
>>>
>>> [0]PETSC ERROR: #4 MatCreateSubMatrix() line 7859 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/interface/matrix.c
>>>
>>> [0]PETSC ERROR: #5 PCSetUp_FieldSplit() line 676 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>
>>> [0]PETSC ERROR: #6 PCSetUp() line 894 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
>>>
>>> [0]PETSC ERROR: #7 KSPSetUp() line 376 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Corrupt argument:
>>> https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>
>>> [0]PETSC ERROR: Invalid Pointer to Object: Parameter # 1
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for
>>> trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804
>>> by Unknown Thu Jul 16 13:08:39 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #8 MatDestroy() line 1266 in
>>> /usr/local/petsc/petsc-3.12.5/src/mat/interface/matrix.c
>>>
>>> [0]PETSC ERROR: #9 PCSetUp_FieldSplit() line 697 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>
>>> [0]PETSC ERROR: #10 PCSetUp() line 894 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
>>>
>>> [0]PETSC ERROR: #11 KSPSetUp() line 376 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>> [0]PETSC ERROR: #12 KSPSolve() line 703 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>>
>>>
>>> I read the examples in the manual, and most are based on 2*2 blocks, but
>>> I still cannot figure out how PCFIELDSPLIT should be set up for these 3*3
>>> blocks, and how the ISs should be set accordingly. I read about the
>>> -fieldsplit_1_pc_type none option suggested in the documentation, but I am
>>> not sure whether it is still valid for 3*3 blocks.
>>>
>>>
>>>
>>> Regards
>>>
>>> JT
>>>
>>> On Jul 16 2020, at 6:57 am, Junchao Zhang <junchao.zhang at gmail.com>
>>> wrote:
>>>
>>> Adding the MatNestSetVecType() line fixed the problem.  But I don't know
>>> why PETSc had this weird design originally.
>>>
>>>
>>>
>>>       MatCreateNest(PETSC_COMM_WORLD, 3, NULL, 3, NULL, mblock,
>>> &testMesh->system);
>>>
>>>  +   MatNestSetVecType(testMesh->system,VECNEST);
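>>>
>>> With that set, compatible vectors can then be obtained from the matrix
>>> itself, e.g. (a sketch):
>>>
>>>   Vec x, b;
>>>   MatCreateVecs(testMesh->system, &x, &b);  /* returns set-up VecNest vectors */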
>>>
>>>
>>>
>>> --Junchao Zhang
>>>
>>>
>>>
>>>
>>>
>>> On Wed, Jul 15, 2020 at 2:16 PM Karl Yang <y.juntao at hotmail.com> wrote:
>>>
>>> Hi, Barry,
>>>
>>>
>>>
>>> I created a minimal example to reproduce the error. The example code is
>>> attached.
>>>
>>>
>>>
>>> Mat Object: 1 MPI processes
>>>
>>>   type: nest
>>>
>>>   Matrix object:
>>>
>>>     type=nest, rows=3, cols=3
>>>
>>>     MatNest structure:
>>>
>>>     (0,0) : type=seqaij, rows=5, cols=5
>>>
>>>     (0,1) : NULL
>>>
>>>     (0,2) : type=seqaij, rows=5, cols=5
>>>
>>>     (1,0) : NULL
>>>
>>>     (1,1) : type=seqaij, rows=5, cols=5
>>>
>>>     (1,2) : type=seqaij, rows=5, cols=5
>>>
>>>     (2,0) : type=seqaij, rows=5, cols=5
>>>
>>>     (2,1) : type=seqaij, rows=5, cols=5
>>>
>>>     (2,2) : NULL
>>>
>>> Vec Object: 1 MPI processes
>>>
>>>   type: nest
>>>
>>>   VecNest, rows=3,  structure:
>>>
>>>   (0) : type=seq, rows=5
>>>
>>>     Vec Object: 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>   (1) : type=seq, rows=5
>>>
>>>     Vec Object: 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>   (2) : type=seq, rows=5
>>>
>>>     Vec Object: 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>>     1.
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Invalid argument
>>>
>>> [0]PETSC ERROR: Nest vector argument 2 not setup.
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for
>>> trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804
>>> by Unknown Thu Jul 16 03:11:02 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #1 VecCopy_Nest() line 68 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
>>>
>>> [0]PETSC ERROR: #2 VecCopy() line 1577 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
>>>
>>> [0]PETSC ERROR: #3 KSPInitialResidual() line 61 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
>>>
>>> [0]PETSC ERROR: #4 KSPSolve_GMRES() line 236 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
>>>
>>> [0]PETSC ERROR: #5 KSPSolve() line 760 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>> On Jul 16 2020, at 2:39 am, Barry Smith <bsmith at petsc.dev> wrote:
>>>
>>>
>>>
>>>   Could be a bug in PETSc, please send a sample program that produces
>>> the problem.
>>>
>>>
>>>
>>>   Barry
>>>
>>>
>>>
>>>
>>>
>>> On Jul 15, 2020, at 1:28 PM, Karl Yang <y.juntao at hotmail.com> wrote:
>>>
>>>
>>>
>>> Hi, Barry,
>>>
>>>
>>>
>>> Thanks for your reply. I output the matrix and vector with MatView and
>>> VecView, and I think I do have the values set, as shown below.
>>>
>>>
>>>
>>> Mat Object: 1 MPI processes
>>>
>>>   type: nest
>>>
>>>   Matrix object:
>>>
>>>     type=nest, rows=3, cols=3
>>>
>>>     MatNest structure:
>>>
>>>     (0,0) : type=seqaij, rows=25, cols=25
>>>
>>>     (0,1) : NULL
>>>
>>>     (0,2) : type=seqaij, rows=25, cols=25
>>>
>>>     (1,0) : NULL
>>>
>>>     (1,1) : type=seqaij, rows=25, cols=25
>>>
>>>     (1,2) : type=seqaij, rows=25, cols=25
>>>
>>>     (2,0) : type=seqaij, rows=25, cols=25
>>>
>>>     (2,1) : type=seqaij, rows=25, cols=25
>>>
>>>     (2,2) : NULL
>>>
>>> Vec Object: 1 MPI processes
>>>
>>>   type: nest
>>>
>>>   VecNest, rows=3,  structure:
>>>
>>>   (0) : type=seq, rows=25
>>>
>>>     Vec Object: 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     75.8279
>>>
>>>     -51.0776
>>>
>>>     -75.8279
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     -90.8404
>>>
>>>     58.7011
>>>
>>>     90.8404
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     -75.8279
>>>
>>>     51.0776
>>>
>>>     75.8279
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>   (1) : name="Vec_0x84000000_0", type=seq, rows=25
>>>
>>>     Vec Object: Vec_0x84000000_0 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     75.8279
>>>
>>>     -51.0776
>>>
>>>     -75.8279
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     -90.8404
>>>
>>>     58.7011
>>>
>>>     90.8404
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     -75.8279
>>>
>>>     51.0776
>>>
>>>     75.8279
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>   (2) : type=seq, rows=25
>>>
>>>     Vec Object: 1 MPI processes
>>>
>>>       type: seq
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>>     0.
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Invalid argument
>>>
>>> [0]PETSC ERROR: Nest vector argument 4 not setup.
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804
>>> by Unknown Thu Jul 16 02:15:26 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #1 VecPointwiseMult_Nest() line 245 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
>>>
>>> [0]PETSC ERROR: #2 VecPointwiseMult() line 1107 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
>>>
>>> [0]PETSC ERROR: #3 PCApply_Jacobi() line 272 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/jacobi/jacobi.c
>>>
>>> [0]PETSC ERROR: #4 PCApply() line 444 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
>>>
>>> [0]PETSC ERROR: #5 KSP_PCApply() line 281 in
>>> /usr/local/petsc/petsc-3.12.5/include/petsc/private/kspimpl.h
>>>
>>> [0]PETSC ERROR: #6 KSPInitialResidual() line 65 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
>>>
>>> [0]PETSC ERROR: #7 KSPSolve_GMRES() line 236 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
>>>
>>> [0]PETSC ERROR: #8 KSPSolve() line 760 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>>
>>>
>>> On Jul 16 2020, at 2:13 am, Barry Smith <bsmith at petsc.dev> wrote:
>>>
>>>
>>>
>>>   From the error message, the second vector you are using to form the
>>> nest vector has not been completely created/set up. Make sure you called
>>> either VecCreateSeq() or VecCreateMPI(), or did VecCreate(), VecSetType(),
>>> and VecSetUp(); and if it is the right-hand side, make sure you put
>>> numerical values in it.
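>>>
>>> For instance, a minimal sketch for one sequential subvector (the size 25
>>> matches your output; the VecSet() value stands in for your actual
>>> right-hand-side entries):
>>>
>>>   Vec sub;
>>>   VecCreate(PETSC_COMM_SELF, &sub);
>>>   VecSetSizes(sub, PETSC_DECIDE, 25);
>>>   VecSetType(sub, VECSEQ);  /* or simply VecCreateSeq(PETSC_COMM_SELF, 25, &sub) */
>>>   VecSetUp(sub);
>>>   VecSet(sub, 1.0);         /* give the right-hand side numerical values */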
>>>
>>>
>>>
>>>    Barry
>>>
>>>
>>>
>>>
>>>
>>> On Jul 15, 2020, at 1:05 PM, Karl Yang <y.juntao at hotmail.com> wrote:
>>>
>>>
>>>
>>> Hello,
>>>
>>>
>>>
>>> I'm trying to solve a 2*2 block matrix system.
>>>
>>> I managed to form a 2*2 MatNest and a 2*1 VecNest. However, when I try to
>>> use KSPSolve, I encounter the following error. There are only one or two
>>> examples on MatNest and VecNest, and I could not figure out what the
>>> trouble might be.
>>>
>>>
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Invalid argument
>>>
>>> [0]PETSC ERROR: Nest vector argument 2 not setup.
>>>
>>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
>>>
>>> [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804
>>> by Unknown Thu Jul 16 01:56:11 2020
>>>
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>>> --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
>>>
>>> [0]PETSC ERROR: #1 VecCopy_Nest() line 68 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
>>>
>>> [0]PETSC ERROR: #2 VecCopy() line 1577 in
>>> /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
>>>
>>> [0]PETSC ERROR: #3 KSPInitialResidual() line 61 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
>>>
>>> [0]PETSC ERROR: #4 KSPSolve_GMRES() line 236 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
>>>
>>> [0]PETSC ERROR: #5 KSPSolve() line 760 in
>>> /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
>>>
>>>
>>>
>>> Regards
>>>
>>> JT
>>>
>>>
>>>
>>>
>>>
>>> --
>>>
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>>
>>> -- Norbert Wiener
>>>
>>>
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>>>
>>>
>>> --
>>>
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>>
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

