[petsc-users] DMPlexDistribute question

Matthew Knepley knepley at gmail.com
Tue Dec 31 11:44:57 CST 2013


On Tue, Dec 31, 2013 at 11:38 AM, Yaakoub El Khamra <yaakoub at tacc.utexas.edu
> wrote:

>
> snes ex12 also hangs. Then I started trying all other examples, all MPI
> examples hang. Basic MPI code and benchmarks appear to compile and run
> fine. I recompiled PETSc and went carefully over the output; there are a
> lot of warnings (even from a stable build).
>
> If I compile petsc with openmpi 1.7.3 and gcc 4.8.2 and the configure line:
>
> ./configure --with-mpi-compilers=yes --download-metis --download-parmetis
> --download-fiat --with-fortran=yes --download-scientificpython
> --download-f-blas-lapack=1  --download-triangle --download-chaco
> --download-exodusii --download-netcdf --download-hdf5
>
> I get a lot of warnings from MPI that might be significant. I can provide
> access to the Fedora 20 box where I am working.
>

Either send login info to petsc-maint at mcs.anl.gov, or get a stack trace of
the hang using gdb.
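
[Editor's note: one way to grab such a backtrace from a hung MPI run is to
attach gdb to the stuck process from another terminal. A sketch; the
executable name and <pid> are placeholders:

```
# Find the PIDs of the hung ranks (assuming the executable is ./ex12):
pgrep -f ex12
# Attach to one rank non-interactively and dump the stacks of all threads:
gdb -p <pid> -batch -ex "thread apply all bt"
```

Repeating this on each rank shows where every process is blocked, which
usually identifies a mismatched collective.]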

  Thanks,

    Matt


> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:159:5: warning:
> ‘MPI_Keyval_create’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1506): MPI_Keyval_create is superseded by
> MPI_Comm_create_keyval in MPI-2.0 [-Wdeprecated-declarations]
>      ierr =
> MPI_Keyval_create(MPI_NULL_COPY_FN,Petsc_DelTmpShared,&Petsc_Tmp_keyval,0);CHKERRQ(ierr);
>      ^
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:159:5: warning:
> ‘OMPI_C_MPI_NULL_COPY_FN’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:828): MPI_NULL_COPY_FN is deprecated in
> MPI-2.0 [-Wdeprecated-declarations]
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:162:3: warning:
> ‘MPI_Attr_get’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1201): MPI_Attr_get is superseded by
> MPI_Comm_get_attr in MPI-2.0 [-Wdeprecated-declarations]
>    ierr =
> MPI_Attr_get(comm,Petsc_Tmp_keyval,(void**)&tagvalp,(int*)&iflg);CHKERRQ(ierr);
>    ^
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:168:5: warning:
> ‘MPI_Attr_put’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1203): MPI_Attr_put is superseded by
> MPI_Comm_set_attr in MPI-2.0 [-Wdeprecated-declarations]
>      ierr = MPI_Attr_put(comm,Petsc_Tmp_keyval,tagvalp);CHKERRQ(ierr);
>      ^
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c: In function
> ‘PetscSharedWorkingDirectory’:
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:282:5: warning:
> ‘MPI_Keyval_create’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1506): MPI_Keyval_create is superseded by
> MPI_Comm_create_keyval in MPI-2.0 [-Wdeprecated-declarations]
>      ierr =
> MPI_Keyval_create(MPI_NULL_COPY_FN,Petsc_DelTmpShared,&Petsc_WD_keyval,0);CHKERRQ(ierr);
>      ^
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:282:5: warning:
> ‘OMPI_C_MPI_NULL_COPY_FN’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:828): MPI_NULL_COPY_FN is deprecated in
> MPI-2.0 [-Wdeprecated-declarations]
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:285:3: warning:
> ‘MPI_Attr_get’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1201): MPI_Attr_get is superseded by
> MPI_Comm_get_attr in MPI-2.0 [-Wdeprecated-declarations]
>    ierr =
> MPI_Attr_get(comm,Petsc_WD_keyval,(void**)&tagvalp,(int*)&iflg);CHKERRQ(ierr);
>    ^
> /home/yye00/Work/Saswata/petsc/src/sys/fileio/fretrieve.c:291:5: warning:
> ‘MPI_Attr_put’ is deprecated (declared at
> /usr/include/openmpi-x86_64/mpi.h:1203): MPI_Attr_put is superseded by
> MPI_Comm_set_attr in MPI-2.0 [-Wdeprecated-declarations]
>      ierr = MPI_Attr_put(comm,Petsc_WD_keyval,tagvalp);CHKERRQ(ierr);
>
>
>
> Regards
> Yaakoub El Khamra
>
>
> On Mon, Dec 30, 2013 at 10:58 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Mon, Dec 30, 2013 at 7:34 PM, Yaakoub El Khamra <
>> yaakoub at tacc.utexas.edu> wrote:
>>
>>>
>>> If I create a dm with DMPlexCreateBoxMesh and immediately attempt to
>>> distribute it with DMPlexDistribute, the call to DMPlexDistribute never
>>> returns. I am working with a development checkout and the code looks as
>>> follows:
>>>
>>>       call DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, dm,
>>>      $     ierr)
>>>       CHKERRQ(ierr)
>>>
>>>       call DMPlexDistribute(dm, "chaco", 0,
>>>      $ PETSC_NULL_OBJECT, distributedMesh, ierr)
>>>       CHKERRQ(ierr)
>>>
>>> Any thoughts?
>>>
>>
>> Can you run SNES ex12? It does exactly this and gets run in the nightly
>> tests. I can run it in parallel
>> with no problems. My guess is that something is going wrong with the
>> Fortran binding. Can you
>> get a stack trace for the hang? If not, can you send a full Fortran
>> program which exhibits your problem?
>>
>>    Matt
>>
>>
>>> Regards
>>> Yaakoub El Khamra
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener