[petsc-users] Errors encountered with KSPsolver with MATNEST and VECNEST
Karl Yang
y.juntao at hotmail.com
Wed Jul 15 14:16:00 CDT 2020
Hi, Barry,
I created a minimal example that reproduces the error. The example code is attached.
Mat Object: 1 MPI processes
type: nest
Matrix object:
type=nest, rows=3, cols=3
MatNest structure:
(0,0) : type=seqaij, rows=5, cols=5
(0,1) : NULL
(0,2) : type=seqaij, rows=5, cols=5
(1,0) : NULL
(1,1) : type=seqaij, rows=5, cols=5
(1,2) : type=seqaij, rows=5, cols=5
(2,0) : type=seqaij, rows=5, cols=5
(2,1) : type=seqaij, rows=5, cols=5
(2,2) : NULL
Vec Object: 1 MPI processes
type: nest
VecNest, rows=3, structure:
(0) : type=seq, rows=5
Vec Object: 1 MPI processes
type: seq
1.
1.
1.
1.
1.
(1) : type=seq, rows=5
Vec Object: 1 MPI processes
type: seq
1.
1.
1.
1.
1.
(2) : type=seq, rows=5
Vec Object: 1 MPI processes
type: seq
1.
1.
1.
1.
1.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Nest vector argument 2 not setup.
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
[0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804 by Unknown Thu Jul 16 03:11:02 2020
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
[0]PETSC ERROR: #1 VecCopy_Nest() line 68 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
[0]PETSC ERROR: #2 VecCopy() line 1577 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #3 KSPInitialResidual() line 61 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #4 KSPSolve_GMRES() line 236 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #5 KSPSolve() line 760 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
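
For reference, the setup follows roughly the pattern below (a simplified, hypothetical sketch rather than the attached test.cpp; the block size, values, and the Jacobi preconditioner are made up for illustration): assemble the seqaij sub-matrices, fully create and fill every sub-vector, wrap them in MATNEST/VECNEST, and call KSPSolve.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            blockA, A;
  Vec            bsub[2], b, x;
  KSP            ksp;
  PC             pc;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* A 5x5 diagonal seqaij block, reused on the diagonal of the nest */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 5, 5, 1, NULL, &blockA); CHKERRQ(ierr);
  for (i = 0; i < 5; i++) { ierr = MatSetValue(blockA, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr); }
  ierr = MatAssemblyBegin(blockA, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(blockA, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

  /* 2x2 MATNEST; NULL marks an empty block */
  Mat blocks[4] = {blockA, NULL, NULL, blockA};
  ierr = MatCreateNest(PETSC_COMM_SELF, 2, NULL, 2, NULL, blocks, &A); CHKERRQ(ierr);

  /* Each sub-vector is fully created and filled before VecCreateNest() */
  for (i = 0; i < 2; i++) {
    ierr = VecCreateSeq(PETSC_COMM_SELF, 5, &bsub[i]); CHKERRQ(ierr);
    ierr = VecSet(bsub[i], 1.0); CHKERRQ(ierr);
  }
  ierr = VecCreateNest(PETSC_COMM_SELF, 2, NULL, bsub, &b); CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x); CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_SELF, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCJACOBI); CHKERRQ(ierr); /* MATNEST has no factorization, so avoid the default ILU */
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&b); CHKERRQ(ierr);
  for (i = 0; i < 2; i++) { ierr = VecDestroy(&bsub[i]); CHKERRQ(ierr); }
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = MatDestroy(&blockA); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}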
On Jul 16 2020, at 2:39 am, Barry Smith <bsmith at petsc.dev> wrote:
>
> It could be a bug in PETSc; please send a sample program that reproduces the problem.
>
> Barry
>
>
> > On Jul 15, 2020, at 1:28 PM, Karl Yang <y.juntao at hotmail.com> wrote:
> > Hi, Barry,
> > Thanks for your reply. I printed the matrix and vector with MatView and VecView, and I think the values are set, as shown below.
> > Mat Object: 1 MPI processes
> > type: nest
> > Matrix object:
> > type=nest, rows=3, cols=3
> > MatNest structure:
> > (0,0) : type=seqaij, rows=25, cols=25
> > (0,1) : NULL
> > (0,2) : type=seqaij, rows=25, cols=25
> > (1,0) : NULL
> > (1,1) : type=seqaij, rows=25, cols=25
> > (1,2) : type=seqaij, rows=25, cols=25
> > (2,0) : type=seqaij, rows=25, cols=25
> > (2,1) : type=seqaij, rows=25, cols=25
> > (2,2) : NULL
> > Vec Object: 1 MPI processes
> > type: nest
> > VecNest, rows=3, structure:
> > (0) : type=seq, rows=25
> > Vec Object: 1 MPI processes
> > type: seq
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 75.8279
> > -51.0776
> > -75.8279
> > 0.
> > 0.
> > -90.8404
> > 58.7011
> > 90.8404
> > 0.
> > 0.
> > -75.8279
> > 51.0776
> > 75.8279
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > (1) : name="Vec_0x84000000_0", type=seq, rows=25
> > Vec Object: Vec_0x84000000_0 1 MPI processes
> > type: seq
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 75.8279
> > -51.0776
> > -75.8279
> > 0.
> > 0.
> > -90.8404
> > 58.7011
> > 90.8404
> > 0.
> > 0.
> > -75.8279
> > 51.0776
> > 75.8279
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > (2) : type=seq, rows=25
> > Vec Object: 1 MPI processes
> > type: seq
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > 0.
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [0]PETSC ERROR: Invalid argument
> > [0]PETSC ERROR: Nest vector argument 4 not setup.
> > [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
> > [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804 by Unknown Thu Jul 16 02:15:26 2020
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
> > [0]PETSC ERROR: #1 VecPointwiseMult_Nest() line 245 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
> > [0]PETSC ERROR: #2 VecPointwiseMult() line 1107 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
> > [0]PETSC ERROR: #3 PCApply_Jacobi() line 272 in /usr/local/petsc/petsc-3.12.5/src/ksp/pc/impls/jacobi/jacobi.c
> > [0]PETSC ERROR: #4 PCApply() line 444 in /usr/local/petsc/petsc-3.12.5/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #5 KSP_PCApply() line 281 in /usr/local/petsc/petsc-3.12.5/include/petsc/private/kspimpl.h
> > [0]PETSC ERROR: #6 KSPInitialResidual() line 65 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #7 KSPSolve_GMRES() line 236 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #8 KSPSolve() line 760 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
> >
> > On Jul 16 2020, at 2:13 am, Barry Smith <bsmith at petsc.dev> wrote:
> > >
> > > From the error message, the second vector you are using to form the nest vector has not been completely created/set up. Make sure you called either VecCreateSeq() or VecCreateMPI(), or did VecCreate(), VecSetType(), and VecSetUp(); and if it is the right-hand side, make sure you put numerical values in it.
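> > > For example, either of these sequences (an untested sketch; the size n and the value 1.0 are placeholders) fully sets up a sub-vector before it is handed to VecCreateNest():
> > >
> > > Vec      v;
> > > PetscInt n = 25;
> > > /* one-step creation */
> > > VecCreateSeq(PETSC_COMM_SELF, n, &v);
> > > /* or, equivalently */
> > > VecCreate(PETSC_COMM_SELF, &v);
> > > VecSetSizes(v, PETSC_DECIDE, n);
> > > VecSetType(v, VECSEQ);
> > > VecSetUp(v);
> > > /* and, for a right-hand side, put numbers in it */
> > > VecSet(v, 1.0);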
> > >
> > > Barry
> > >
> > >
> > > > On Jul 15, 2020, at 1:05 PM, Karl Yang <y.juntao at hotmail.com> wrote:
> > > > Hello,
> > > > I'm trying to solve a 2*2 block matrix system.
> > > > I managed to form a 2*2 MatNest and a 2*1 VecNest. However, when I try to use KSPSolve, I encounter the following error. There are only one or two examples on MatNest and VecNest, and I could not figure out what the trouble could be.
> > > >
> > > >
> > > > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > > [0]PETSC ERROR: Invalid argument
> > > > [0]PETSC ERROR: Nest vector argument 2 not setup.
> > > > [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.12.5, Mar, 29, 2020
> > > > [0]PETSC ERROR: ./testSolve on a arch-linux2-c-debug named f601294ac804 by Unknown Thu Jul 16 01:56:11 2020
> > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack --with-cuda
> > > > [0]PETSC ERROR: #1 VecCopy_Nest() line 68 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/impls/nest/vecnest.c
> > > > [0]PETSC ERROR: #2 VecCopy() line 1577 in /usr/local/petsc/petsc-3.12.5/src/vec/vec/interface/vector.c
> > > > [0]PETSC ERROR: #3 KSPInitialResidual() line 61 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itres.c
> > > > [0]PETSC ERROR: #4 KSPSolve_GMRES() line 236 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/impls/gmres/gmres.c
> > > > [0]PETSC ERROR: #5 KSPSolve() line 760 in /usr/local/petsc/petsc-3.12.5/src/ksp/ksp/interface/itfunc.c
> > > >
> > > >
> > > > Regards
> > > > JT
> > > >
> > >
> > >
> > >
> >
> >
>
>
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.cpp
Type: application/octet-stream
Size: 3266 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20200716/b4989bb8/attachment-0001.obj>