<div dir="ltr"><div dir="ltr">IIRC, MPI_CHAR is meant for ASCII text data. Also, remember that in C the signedness of plain `char` is implementation-defined.<div><div>I'm not sure MPI_Reduce() is supposed to handle MPI_CHAR; you should use MPI_{SIGNED|UNSIGNED}_CHAR for that. Note, however, that MPI_SIGNED_CHAR was only introduced in MPI 2.0.<br></div></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, 3 Apr 2019 at 07:01, Fande Kong via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr">Hi All,<div><br></div><div>There were some error messages when using PetscSFReduceBegin with MPI_CHAR. </div><div><br></div><div>ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace,space,MPI_SUM);CHKERRQ(ierr);<br></div><div><br></div><div><br></div><div>My question would be: Is PetscSFReduceBegin supposed to work with MPI_CHAR? 
If not, should we document that somewhere?</div><div><br></div><div>Thanks</div><div><br></div><div>Fande,</div><div><br></div><div><br></div><div><div><i>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</i></div><div><i>[0]PETSC ERROR: No support for this operation for this object type</i></div><div><i>[0]PETSC ERROR: No support for type size not divisible by 4</i></div><div><i>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</i></div><div><i>[0]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e GIT Date: 2019-04-02 17:37:18 -0600</i></div><div><i>[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</i></div><div><i>[1]PETSC ERROR: No support for this operation for this object type</i></div><div><i>[1]PETSC ERROR: No support for type size not divisible by 4</i></div><div><i>[1]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</i></div><div><i>[1]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e GIT Date: 2019-04-02 17:37:18 -0600</i></div><div><i>[1]PETSC ERROR: ./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named fn605731.local by kongf Tue Apr 2 21:48:41 2019</i></div><div><i>[1]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-parmetis=1 --download-superlu_dist=1 PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch --download-party --download-chaco --with-cxx-dialect=C++11</i></div><div><i>[1]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>[1]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>[1]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named fn605731.local by kongf Tue Apr 2 21:48:41 2019</i></div><div><i>[0]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-parmetis=1 --download-superlu_dist=1 PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch --download-party --download-chaco --with-cxx-dialect=C++11</i></div><div><i>[0]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>[0]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>[0]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c</i></div><div><i>[0]PETSC ERROR: #4 PetscSFReduceBegin() line 1208 in /Users/kongf/projects/petsc/src/vec/is/sf/interface/sf.c</i></div><div><i>[0]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIAIJ_allatonce() line 850 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c</i></div><div><i>[0]PETSC ERROR: #6 MatPtAP_MPIAIJ_MPIAIJ() line 202 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c</i></div><div><i>[0]PETSC ERROR: #7 MatPtAP() line 9429 in /Users/kongf/projects/petsc/src/mat/interface/matrix.c</i></div><div><i>[0]PETSC ERROR: #8 main() line 58 in /Users/kongf/projects/petsc/src/mat/examples/tests/ex90.c</i></div><div><i>[0]PETSC ERROR: PETSc Option Table entries:</i></div><div><i>[0]PETSC ERROR: -matptap_via allatonce</i></div><div><i>[0]PETSC 
ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</i></div><div><i>[1]PETSC ERROR: #4 PetscSFReduceBegin() line 1208 in /Users/kongf/projects/petsc/src/vec/is/sf/interface/sf.c</i></div><div><i>[1]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIAIJ_allatonce() line 850 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c</i></div><div><i>[1]PETSC ERROR: #6 MatPtAP_MPIAIJ_MPIAIJ() line 202 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c</i></div><div><i>[1]PETSC ERROR: #7 MatPtAP() line 9429 in /Users/kongf/projects/petsc/src/mat/interface/matrix.c</i></div><div><i>[1]PETSC ERROR: #8 main() line 58 in /Users/kongf/projects/petsc/src/mat/examples/tests/ex90.c</i></div><div><i>[1]PETSC ERROR: PETSc Option Table entries:</i></div><div><i>[1]PETSC ERROR: -matptap_via allatonce</i></div><div><i>[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</i></div><div><i>--------------------------------------------------------------------------</i></div><div><i>MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD</i></div><div><i>with errorcode 56.</i></div><div><i><br></i></div><div><i>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.</i></div><div><i>You may or may not see output from other processes, depending on</i></div><div><i>exactly when Open MPI kills them.</i></div><div><i>--------------------------------------------------------------------------</i></div><div><i>[fn605731.local:78133] 1 more process has sent help message help-mpi-api.txt / mpi-abort</i></div><div><i>[fn605731.local:78133] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages</i></div></div></div></div></div></div></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div>Lisandro Dalcin<br>============<br>Research Scientist<br>Extreme Computing Research Center (ECRC)<br>King Abdullah University of Science and Technology (KAUST)<br><a href="http://ecrc.kaust.edu.sa/" target="_blank">http://ecrc.kaust.edu.sa/</a><br></div></div></div>