[petsc-users] On the usage of MatSetValuesBlocked

Edoardo alinovi edoardo.alinovi at gmail.com
Fri Nov 4 03:32:13 CDT 2022


It is working like a charm now!

Is it mandatory to use VecSetValuesBlocked to assemble the rhs? Does the
Vec need to be of any other type than VECMPI?

I am assembling it like this:
                brhs(1:3-bdim) = this%Ueqn%bC(iElement, 1:3-bdim)
                brhs(4-bdim)   = this%Peqn%bC(iElement, 1)
                call VecSetValuesBlocked(this%rhs, 1, mesh%cellGlobalAddr(iElement)-1, brhs, INSERT_VALUES, ierr)

But I am running into trouble:

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: PetscSegBufferAlloc_Private
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022
[0]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov  4
09:31:03 2022
[0]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3
COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1
-download-superlu_dist -download-mumps -download-hypre -download-metis
-download-parmetis -download-scalapack -download-ml -download-slepc
-download-hpddm -download-cmake
-with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/
[0]PETSC ERROR: #1 PetscMallocAlign() at
/home/edo/software/petsc-3.18.0/src/sys/memory/mal.c:55
[0]PETSC ERROR: #2 PetscSegBufferAlloc_Private() at
/home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:31
[0]PETSC ERROR: #3 PetscSegBufferGet() at
/home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:94
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: General MPI error
[1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022
[1]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov  4
09:31:03 2022
[1]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3
COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1
-download-superlu_dist -download-mumps -download-hypre -download-metis
-download-parmetis -download-scalapack -download-ml -download-slepc
-download-hpddm -download-cmake
-with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/
[1]PETSC ERROR: #1 VecAssemblySend_MPI_Private() at
/home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:133
[1]PETSC ERROR: #2 PetscCommBuildTwoSidedFReq_Reference() at
/home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:314
[1]PETSC ERROR: #3 PetscCommBuildTwoSidedFReq() at
/home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526
[1]PETSC ERROR: [0]PETSC ERROR: #4 VecAssemblyRecv_MPI_Private() at
/home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:164
[0]PETSC ERROR: #5 PetscCommBuildTwoSidedFReq_Reference() at
/home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:320
[0]PETSC ERROR: #6 PetscCommBuildTwoSidedFReq() at
/home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526
[0]PETSC ERROR: #7 VecAssemblyBegin_MPI_BTS() at
/home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238
#4 VecAssemblyBegin_MPI_BTS() at
/home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238
[1]PETSC ERROR: #5 VecAssemblyBegin() at
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124
[1]PETSC ERROR: #6 VecAssemblyEnd_MPI_BTS() at
/home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:337
[1]PETSC ERROR: #7 VecAssemblyEnd() at
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:158
[1]PETSC ERROR: #8 VecView() at
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:719
[0]PETSC ERROR: #8 VecAssemblyBegin() at
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124
Vec Object: 2 MPI processes
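
For reference, here is a minimal sketch of what I understand blocked assembly should look like (the variable names and sizes below, e.g. bs and nLocalCells, are illustrative, not taken from my actual code). As far as I can tell, VecSetValuesBlocked requires the Vec's block size to be set beforehand (e.g. with VecSetBlockSize), and the index it takes is a 0-based *block* index rather than a point index:

    ! Illustrative sketch of blocked RHS assembly with PETSc's Fortran bindings.
    ! The Vec must carry a block size before VecSetValuesBlocked is called.
    PetscInt, parameter :: bs = 4              ! block size (illustrative)
    PetscInt            :: iblock(1)
    PetscScalar         :: brhs(bs)
    PetscErrorCode      :: ierr
    Vec                 :: rhs

    ! Local length is (number of local blocks) * (block size).
    call VecCreateMPI(PETSC_COMM_WORLD, nLocalCells*bs, PETSC_DECIDE, rhs, ierr)
    call VecSetBlockSize(rhs, bs, ierr)

    iblock(1) = mesh%cellGlobalAddr(iElement) - 1   ! 0-based global block index
    call VecSetValuesBlocked(rhs, 1, iblock, brhs, INSERT_VALUES, ierr)

    call VecAssemblyBegin(rhs, ierr)
    call VecAssemblyEnd(rhs, ierr)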
