[petsc-users] PetscSFReduceBegin can not handle MPI_CHAR?
Zhang, Junchao
jczhang at mcs.anl.gov
Wed Apr 3 22:40:35 CDT 2019
On Wed, Apr 3, 2019, 10:29 PM Fande Kong <fdkong.jd at gmail.com> wrote:
Thanks for the reply. It is not necessary for me to use MPI_SUM. I think the better choice is MPIU_REPLACE. Doesn't MPIU_REPLACE work for any MPI datatype?
Yes.
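A minimal sketch of the suggested change, reusing the names from the snippet quoted below (the buffers keep their original roles; whether MPI_CHAR itself is accepted also depends on the PetscSF fix discussed further down):

ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr); /* overwrite root entries with leaf values instead of summing */
ierr = PetscSFReduceEnd(ptap->sf,MPI_CHAR,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr);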
Fande
On Apr 3, 2019, at 9:15 PM, Zhang, Junchao <jczhang at mcs.anl.gov> wrote:
On Wed, Apr 3, 2019 at 3:41 AM Lisandro Dalcin via petsc-users <petsc-users at mcs.anl.gov> wrote:
IIRC, MPI_CHAR is for ASCII text data. Also, remember that in C the signedness of plain `char` is implementation (or platform?) dependent.
I'm not sure MPI_Reduce() is supposed to handle MPI_CHAR; you should use MPI_{SIGNED|UNSIGNED}_CHAR for that. Note, however, that MPI_SIGNED_CHAR was introduced in MPI 2.0.
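For illustration, a minimal standalone sketch of that distinction (hypothetical buffers, not from the thread): reducing MPI_SIGNED_CHAR is legal, reducing MPI_CHAR is not.

#include <mpi.h>
int main(int argc, char **argv)
{
  signed char local[2] = {1, 2}, global[2];
  MPI_Init(&argc, &argv);
  /* Allowed: MPI_SIGNED_CHAR is an integer type, so MPI_SUM applies */
  MPI_Reduce(local, global, 2, MPI_SIGNED_CHAR, MPI_SUM, 0, MPI_COMM_WORLD);
  /* Not allowed by the standard: MPI_CHAR represents printable characters */
  /* MPI_Reduce(local, global, 2, MPI_CHAR, MPI_SUM, 0, MPI_COMM_WORLD); */
  MPI_Finalize();
  return 0;
}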
The MPI standard, chapter 5.9.3, says: "MPI_CHAR, MPI_WCHAR, and MPI_CHARACTER (which represent printable characters) cannot be used in reduction operations."
So Fande's code and Jed's branch have a problem. To fix it, we have to add support for signed char, unsigned char, and plain char in PetscSF. The first two types would support add, mult, and the logical and bitwise operations; plain char is a text type that only supports pack/unpack (i.e., replace). With this fix, PetscSF/MPI would still raise an error on Fande's code, since it reduces MPI_CHAR with MPI_SUM. I can come up with a fix tomorrow.
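A rough sketch of what such calls could look like once that support exists (sf, leafdata, rootdata, leafchars, rootchars are hypothetical names; this is an assumption about the outcome of the fix, not the actual patch):

/* signed/unsigned char: arithmetic, logical, and bitwise reductions */
ierr = PetscSFReduceBegin(sf,MPI_SIGNED_CHAR,leafdata,rootdata,MPI_SUM);CHKERRQ(ierr);
ierr = PetscSFReduceEnd(sf,MPI_SIGNED_CHAR,leafdata,rootdata,MPI_SUM);CHKERRQ(ierr);
/* plain char: pack/unpack only, i.e. MPIU_REPLACE; MPI_SUM would raise an error */
ierr = PetscSFReduceBegin(sf,MPI_CHAR,leafchars,rootchars,MPIU_REPLACE);CHKERRQ(ierr);
ierr = PetscSFReduceEnd(sf,MPI_CHAR,leafchars,rootchars,MPIU_REPLACE);CHKERRQ(ierr);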
On Wed, 3 Apr 2019 at 07:01, Fande Kong via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hi All,
There were some error messages when using PetscSFReduceBegin with MPI_CHAR.
ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace,space,MPI_SUM);CHKERRQ(ierr);
My question is: is PetscSFReduceBegin supposed to work with MPI_CHAR? If not, should we document that somewhere?
Thanks
Fande,
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: No support for type size not divisible by 4
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e GIT Date: 2019-04-02 17:37:18 -0600
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: No support for this operation for this object type
[1]PETSC ERROR: No support for type size not divisible by 4
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e GIT Date: 2019-04-02 17:37:18 -0600
[1]PETSC ERROR: ./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named fn605731.local by kongf Tue Apr 2 21:48:41 2019
[1]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-parmetis=1 --download-superlu_dist=1 PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch --download-party --download-chaco --with-cxx-dialect=C++11
[1]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[1]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[1]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named fn605731.local by kongf Tue Apr 2 21:48:41 2019
[0]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-parmetis=1 --download-superlu_dist=1 PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch --download-party --download-chaco --with-cxx-dialect=C++11
[0]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in /Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #4 PetscSFReduceBegin() line 1208 in /Users/kongf/projects/petsc/src/vec/is/sf/interface/sf.c
[0]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIAIJ_allatonce() line 850 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[0]PETSC ERROR: #6 MatPtAP_MPIAIJ_MPIAIJ() line 202 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[0]PETSC ERROR: #7 MatPtAP() line 9429 in /Users/kongf/projects/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #8 main() line 58 in /Users/kongf/projects/petsc/src/mat/examples/tests/ex90.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -matptap_via allatonce
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
[1]PETSC ERROR: #4 PetscSFReduceBegin() line 1208 in /Users/kongf/projects/petsc/src/vec/is/sf/interface/sf.c
[1]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIAIJ_allatonce() line 850 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[1]PETSC ERROR: #6 MatPtAP_MPIAIJ_MPIAIJ() line 202 in /Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[1]PETSC ERROR: #7 MatPtAP() line 9429 in /Users/kongf/projects/petsc/src/mat/interface/matrix.c
[1]PETSC ERROR: #8 main() line 58 in /Users/kongf/projects/petsc/src/mat/examples/tests/ex90.c
[1]PETSC ERROR: PETSc Option Table entries:
[1]PETSC ERROR: -matptap_via allatonce
[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 56.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[fn605731.local:78133] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[fn605731.local:78133] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
--
Lisandro Dalcin
============
Research Scientist
Extreme Computing Research Center (ECRC)
King Abdullah University of Science and Technology (KAUST)
http://ecrc.kaust.edu.sa/