[petsc-users] PetscCommGetNewTag issue

Fande Kong fd.kong at siat.ac.cn
Tue Jan 6 19:10:58 CST 2015


Hi all,

I wrote a very simple test code, but it fails with the error messages below.

Code:

#include <petsc.h>
static char help[] = " simple test.\n\n";

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **argv)
{
  PetscMPIInt                   tag = 0;
  PetscErrorCode                ierr;

  ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);
  ierr = PetscCommGetNewTag(PETSC_COMM_WORLD,&tag);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF," tag %d \n", tag);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}


Run: mpirun -np 2 ./Test

Error messages:

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Corrupt argument:
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: Bad MPI communicator supplied; must be a PETSc communicator
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014
[0]PETSC ERROR: ./Test on a arch-darwin-cxx-debug named
engr2-7-113-dhcp.int.colorado.edu by livia Tue Jan  6 18:07:27 2015
[0]PETSC ERROR: Configure options --with-clanguage=cxx
--with-shared-libraries=1 --download-fblaslapack=1
--with-mpi-dir=/Users/livia/math/mpich-3.1_install --download-parmetis=1
--download-metis=1 --with-64-bit-indices=1 --download-netcdf=1
--download-exodusii=1 --download-hdf5=1
--with-mpi-dir=/Users/livia/math/mpich-3.1_install
[0]PETSC ERROR: #1 PetscCommGetNewTag() line 85 in
/Users/livia/math/petsc-3.5.2/src/sys/objects/tagm.c
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: #2 main() line 19 in
/Users/livia/math/Spmcs/src/multigrid/test/Test.cpp
[0]PETSC ERROR: ----------------End of Error Message -------send entire
error message to petsc-maint at mcs.anl.gov----------
[1]PETSC ERROR: Corrupt argument:
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: Bad MPI communicator supplied; must be a PETSc communicator
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014
[1]PETSC ERROR: ./Test on a arch-darwin-cxx-debug named
engr2-7-113-dhcp.int.colorado.edu by livia Tue Jan  6 18:07:27 2015
[1]PETSC ERROR: Configure options --with-clanguage=cxx
--with-shared-libraries=1 --download-fblaslapack=1
--with-mpi-dir=/Users/livia/math/mpich-3.1_install --download-parmetis=1
--download-metis=1 --with-64-bit-indices=1 --download-netcdf=1
--download-exodusii=1 --download-hdf5=1
--with-mpi-dir=/Users/livia/math/mpich-3.1_install
[1]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 64) - process 0
#1 PetscCommGetNewTag() line 85 in
/Users/livia/math/petsc-3.5.2/src/sys/objects/tagm.c
[1]PETSC ERROR: #2 main() line 19 in
/Users/livia/math/Spmcs/src/multigrid/test/Test.cpp
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 64) - process 0
[1]PETSC ERROR: ----------------End of Error Message -------send entire
error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 64) - process 1
[cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 64) - process 1

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 1783 RUNNING AT engr2-7-113-dhcp.int.colorado.edu
=   EXIT CODE: 64
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
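If I read the message correctly, PetscCommGetNewTag() seems to expect an inner
PETSc communicator, i.e. one that has gone through PetscCommDuplicate() so that
PETSc's tag counter is attached, while PETSC_COMM_WORLD here is still a plain
MPI communicator. Is the following the intended usage? (A minimal sketch of my
guess, assuming the PETSc 3.5 PetscCommDuplicate()/PetscCommDestroy() calls:)

#include <petsc.h>
static char help[] = " simple test with PetscCommDuplicate.\n\n";

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **argv)
{
  MPI_Comm       comm;
  PetscMPIInt    tag = 0;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);
  /* PetscCommDuplicate() returns the inner PETSc communicator and can
     also hand back a first free tag directly. */
  ierr = PetscCommDuplicate(PETSC_COMM_WORLD,&comm,&tag);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF," first tag %d \n",tag);CHKERRQ(ierr);
  /* Further tags can now be requested from the inner communicator. */
  ierr = PetscCommGetNewTag(comm,&tag);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF," new tag %d \n",tag);CHKERRQ(ierr);
  ierr = PetscCommDestroy(&comm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Or is one supposed to only call PetscObjectGetNewTag() on an existing PETSc
object instead?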


Fande