[petsc-users] Problems in use of VecNest in Petsc?
Rochan Upadhyay
rochan.upadhyay at gmail.com
Sat Jul 15 18:08:51 CDT 2023
Dear Petsc developers,
Is the use of the VecNest format supported for field-split type
preconditioners, or for use with the MatNest format?
A quick change to example snes/tutorials/ex70.c seems to suggest that it is
not. For example, with the changes:
Vec subx[2], subb[2];   /* declared in the Stokes struct */

/* in routine StokesSetupVectors(Stokes *s): */
MatCreateVecs(s->subA[0], &s->subx[0], &s->subb[0]);
MatCreateVecs(s->subA[3], &s->subx[1], &s->subb[1]);

IS is_rows[2];
MatNestGetISs(s->A, is_rows, NULL);

VecCreateNest(PETSC_COMM_WORLD, 2, is_rows, s->subx, &s->x);
VecCreateNest(PETSC_COMM_WORLD, 2, is_rows, s->subb, &s->b);
VecAssemblyBegin(s->x);
VecAssemblyEnd(s->x);
VecAssemblyBegin(s->b);
VecAssemblyEnd(s->b);
....
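For completeness, I also considered assembling the block sub-vectors
explicitly before building the nest, to rule out unassembled sub-vectors as
the cause. A minimal sketch (same s->subx/s->subb fields as above, with
PetscCall() error checking, which the fragment above omits):

/* Sketch only: assemble each block vector before creating the VecNest.
   Uses the same s->subx/s->subb fields as in the fragment above. */
for (PetscInt i = 0; i < 2; i++) {
  PetscCall(VecAssemblyBegin(s->subx[i]));
  PetscCall(VecAssemblyEnd(s->subx[i]));
  PetscCall(VecAssemblyBegin(s->subb[i]));
  PetscCall(VecAssemblyEnd(s->subb[i]));
}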
The serial case runs fine, but the parallel case with np = 2 fails with the
error message copied below. Could you point out what I am doing wrong? In
the error message, which vector(s) need assembling?
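(One way I could imagine narrowing this down, sketched below under the
assumption that the offending vector is one of mine rather than an internal
KSP work vector: loop over the nested blocks and apply an operation such as
VecNorm(), which, like VecCopy(), is presumably not allowed on an
unassembled vector, and see which block triggers the error.)

/* Sketch: probe which nested block of the right-hand side is unassembled. */
PetscInt  n;
Vec      *sub;
PetscReal nrm;
PetscCall(VecNestGetSubVecs(s->b, &n, &sub));
for (PetscInt i = 0; i < n; i++) {
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "checking block %" PetscInt_FMT "\n", i));
  PetscCall(VecNorm(sub[i], NORM_2, &nrm)); /* should fail here if block i is unassembled */
}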
I would normally not use VecNest, but many finite element packages that
allow hooking up with Petsc seem to create their block vectors with Petsc
VecType VecNest.
Any help/insights would be appreciated.
Regards,
Rochan
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.3, Jun 29, 2023
[0]PETSC ERROR: ./ex70 on a named localhost.localdomain by user Sat Jul 15
17:51:01 2023
[0]PETSC ERROR: Configure options --with-clean=yes --with-debugging=no
--with-mpi=1 --with-mpi-dir=/home/user/software_installs/mpich-4.0.2/
--prefix=/home/user/software_installs/petsc-3.19.3/
--PETSC_ARCH=linux-gnu-opt
--PETSC_DIR=/home/user/software_downloads/petsc-3.19.3/
--with-cxx-dialect=C++11 --with-hypre=1 --download-hypre=yes
--with-suitesparse=1 --download-suitesparse=yes --with-metis=1
--download-metis=yes --with-shared-libraries=1 --with-zlib=1
--download-zlib=no --with-superlu_dist=1 --download-superlu_dist=yes
--with-mumps=1 --download-mumps=yes --with-parmetis=1
--download-parmetis=yes --with-hdf5=yes --download-hdf5=1 --with-chaco=1
--download-chaco=yes --with-scalapack=1 --download-scalapack=yes
--with-p4est=1 --download-p4est=yes --download-fblaslapack=1
--with-strumpack=1 --download-strumpack=yes
[0]PETSC ERROR: #1 VecCopy() at
/home/user/software_downloads/petsc-3.19.3/src/vec/vec/interface/vector.c:1684
[0]PETSC ERROR: #2 VecCopy_Nest() at
/home/user/software_downloads/petsc-3.19.3/src/vec/vec/impls/nest/vecnest.c:58
[0]PETSC ERROR: #3 VecCopy() at
/home/user/software_downloads/petsc-3.19.3/src/vec/vec/interface/vector.c:1723
[0]PETSC ERROR: #4 KSPInitialResidual() at
/home/user/software_downloads/petsc-3.19.3/src/ksp/ksp/interface/itres.c:60
[0]PETSC ERROR: #5 KSPSolve_GMRES() at
/home/user/software_downloads/petsc-3.19.3/src/ksp/ksp/impls/gmres/gmres.c:226
[0]PETSC ERROR: #6 KSPSolve_Private() at
/home/user/software_downloads/petsc-3.19.3/src/ksp/ksp/interface/itfunc.c:898
[0]PETSC ERROR: #7 KSPSolve() at
/home/user/software_downloads/petsc-3.19.3/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: #8 main() at /home/user/Projects/Misc/Petsc/SNES/ex70.c:684
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -ksp_monitor (source: command line)
[0]PETSC ERROR: -nx 4 (source: command line)
[0]PETSC ERROR: -ny 8 (source: command line)
[0]PETSC ERROR: ----------------End of Error Message -------send entire
error message to petsc-maint at mcs.anl.gov----------
Abort(73) on node 0 (rank 0 in comm 16): application called
MPI_Abort(MPI_COMM_SELF, 73) - process 0