[petsc-users] Fortran example for PETSc global to natural ordering test
Danyang Su
danyang.su at gmail.com
Tue Dec 4 14:37:17 CST 2018
On 2018-12-04 11:06 a.m., Matthew Knepley wrote:
> On Mon, Dec 3, 2018 at 8:32 PM Danyang Su <danyang.su at gmail.com> wrote:
>
> Hi Matt,
>
> Attached is the test example (source code, makefile, data, and
> screen output) that I wrote this afternoon. This example reads a 2D
> mesh from a vtk file and then distributes it over all processors. I
> get the correct global order of the local nodes after distribution,
> but the natural order of the local nodes is always zero after
> calling DMPlexGlobalToNaturalBegin/End(). It sounds like the call
> does not take effect.
>
> Would you mind checking/testing this code at your earliest
> convenience?
>
> On 2 procs, I get an SEGV
>
> rank1 local nodes with ghost 47 local cells with ghost 75
>
> rank0 local nodes with ghost 52 local cells with ghost 78
>
Interesting. The output on your side is different. On my computer,
running the code with 2 processors, I get
rank 0 local nodes with ghost 47 local cells with ghost 76
rank 1 local nodes with ghost 49 local cells with ghost 74
and there is no error. The example runs without error on 1 to 5
processors. With 6 or more processors, it fails with the error shown
below.
Error in `./natural': corrupted size vs. prev_size: 0x0000000001b9a600
This seems reasonable if the mesh is too small with overlap = 1.
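For reference, the sequence I am testing after distribution boils down to
the following (a minimal sketch, not the attached code; the one-dof-per-vertex
section, the vector names gvec/nvec, and creating the natural vector with
VecDuplicate are my own assumptions; the finclude headers and use statements
are as in the snippet further down the thread):

  ! dm is the distributed DMPlex; DMSetUseNatural(dm,PETSC_TRUE,ierr) was
  ! called before DMPlexDistribute, so dm carries the migration SF
  Vec            :: gvec, nvec
  PetscErrorCode :: ierr

  ! ... create and attach a PetscSection with one dof per vertex ...

  call DMCreateGlobalVector(dm,gvec,ierr)
  CHKERRQ(ierr)
  call VecDuplicate(gvec,nvec,ierr)
  CHKERRQ(ierr)
  ! fill gvec (e.g. with the PETSc global node numbers), then map it to
  ! the natural (input file) ordering
  call DMPlexGlobalToNaturalBegin(dm,gvec,nvec,ierr)
  CHKERRQ(ierr)
  call DMPlexGlobalToNaturalEnd(dm,gvec,nvec,ierr)
  CHKERRQ(ierr)
  ! nvec should now hold the values in natural ordering; in my runs it
  ! comes back as all zeros

The attached example can be run with, for instance, mpiexec -n 2 ./natural.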
The version I use is
Petsc Development GIT revision: v3.10.2-832-ge28ad50 GIT Date:
2018-12-02 07:18:52 +0100
Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
--download-mpich --download-scalapack --download-parmetis
--download-metis --download-ptscotch --download-fblaslapack
--download-hypre --download-superlu_dist --download-hdf5=yes
--download-ctetgen --download-zlib --download-netcdf --download-pnetcdf
--download-exodusii --download-netcdf --with-debugging=1
Thanks,
Danyang
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
>
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
>
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>
> [1]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
>
> [1]PETSC ERROR: likely location of problem given in stack below
>
> [1]PETSC ERROR: ---------------------Stack Frames
> ------------------------------------
>
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
>
> [1]PETSC ERROR: INSTEAD the line number of the start of the function
>
> [1]PETSC ERROR: is given.
>
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
>
> [1]PETSC ERROR: Signal received
>
> [1]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>
> [1]PETSC ERROR: Petsc Development GIT revision:
> v3.9.3-1021-g8625415 GIT Date: 2018-08-02 12:57:14 -0500
>
> [1]PETSC ERROR: Unknown Name on a arch-master-debug named
> MATTHEW-KNEPLEYs-MacBook-Air-2.local by knepley Tue Dec 4 14:00:46 2018
>
> [1]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
> --download-chaco
> --download-cmake=/Users/knepley/Downloads/cmake-3.7.2.tar.gz
> --download-ctetgen --download-eigen --download-fftw --download-hdf5
> --download-med --download-metis --download-mpich --download-netcdf
> --download-p4est --download-parmetis --download-pnetcdf
> --download-superlu_dist --download-triangle
> --with-cc="/Users/knepley/MacSoftware/bin/ccache gcc
> -Qunused-arguments" --with-cxx="/Users/knepley/MacSoftware/bin/ccache
> g++ -Qunused-arguments"
> --with-fc="/Users/knepley/MacSoftware/bin/ccache gfortran"
> --with-shared-libraries
>
> [1]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> [0]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) -
> process 1
>
> ------------------------------------------------------------------------
>
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
>
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
> OS X to find memory corruption errors
>
> [0]PETSC ERROR: likely location of problem given in stack below
>
> [0]PETSC ERROR: ---------------------Stack Frames
> ------------------------------------
>
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
>
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
>
> [0]PETSC ERROR: is given.
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
>
> [0]PETSC ERROR: Signal received
>
> [0]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>
> [0]PETSC ERROR: Petsc Development GIT revision:
> v3.9.3-1021-g8625415 GIT Date: 2018-08-02 12:57:14 -0500
>
> [0]PETSC ERROR: Unknown Name on a arch-master-debug named
> MATTHEW-KNEPLEYs-MacBook-Air-2.local by knepley Tue Dec 4 14:00:46 2018
>
> [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug
> --download-chaco
> --download-cmake=/Users/knepley/Downloads/cmake-3.7.2.tar.gz
> --download-ctetgen --download-eigen --download-fftw --download-hdf5
> --download-med --download-metis --download-mpich --download-netcdf
> --download-p4est --download-parmetis --download-pnetcdf
> --download-superlu_dist --download-triangle
> --with-cc="/Users/knepley/MacSoftware/bin/ccache gcc
> -Qunused-arguments" --with-cxx="/Users/knepley/MacSoftware/bin/ccache
> g++ -Qunused-arguments"
> --with-fc="/Users/knepley/MacSoftware/bin/ccache gfortran"
> --with-shared-libraries
>
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
> which valgrind says comes from DMPlexCreateSection()
>
> rank0 local nodes with ghost 52 local cells with ghost 78
>
> rank1 local nodes with ghost 47 local cells with ghost 75
>
> ==14766== Invalid write of size 4
>
> ==14766==    at 0x1151A3B: dmplexcreatesection_ (zplexsectionf90.c:20)
>
> ==14766==    by 0x1000042FA: MAIN__ (in ./natural)
>
> ==14766==    by 0x10000497D: main (in ./natural)
>
> ==14766==  Address 0x600001000 is not stack'd, malloc'd or (recently) free'd
>
> ==14766==
>
> ==14767== Invalid write of size 4
>
> ==14767==    at 0x1151A3B: dmplexcreatesection_ (zplexsectionf90.c:20)
>
> ==14767==    by 0x1000042FA: MAIN__ (in ./natural)
>
> ==14767==    by 0x10000497D: main (in ./natural)
>
> ==14767==  Address 0x600001000 is not stack'd, malloc'd or (recently) free'd
>
> ==14767==
>
> I cannot run the debugger on it because command line arguments are not
> working with
> my Fortran compiler (Ugh). Do you see this error?
>
> Thanks,
>
> Matt
>
> Thanks,
>
> Danyang
>
> On 2018-12-03 1:12 p.m., Danyang Su wrote:
>>
>>
>> On 2018-12-03 12:56 p.m., Matthew Knepley wrote:
>>> On Mon, Dec 3, 2018 at 3:40 PM Danyang Su <danyang.su at gmail.com> wrote:
>>>
>>>
>>> On 2018-12-03 12:03 p.m., Matthew Knepley wrote:
>>>> On Mon, Dec 3, 2018 at 2:27 PM Danyang Su <danyang.su at gmail.com> wrote:
>>>>
>>>> Hi Matt,
>>>>
>>>> Thanks.
>>>>
>>>> BTW: DMPlexGetVertexNumbering now works using the latest develop
>>>> version. But the index is not in natural ordering when
>>>> DMSetUseNatural is called. That is why I want to use
>>>> PetscSFDistributeSection to check whether I have missed anything
>>>> in the code.
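>>>>
>>>> The way I call it is roughly the following (a sketch with
>>>> hypothetical variable names, not the actual code):
>>>>
>>>>   IS                :: vertexIS
>>>>   PetscInt, pointer :: vtx(:)
>>>>
>>>>   ! borrowed IS, so no ISDestroy needed
>>>>   call DMPlexGetVertexNumbering(dm,vertexIS,ierr)
>>>>   CHKERRQ(ierr)
>>>>   call ISGetIndicesF90(vertexIS,vtx,ierr)
>>>>   CHKERRQ(ierr)
>>>>   ! vtx(i) holds the number assigned to local vertex i; I expected
>>>>   ! these to follow the natural (input file) ordering once
>>>>   ! DMSetUseNatural is set, but they stay in the PETSc ordering
>>>>   call ISRestoreIndicesF90(vertexIS,vtx,ierr)
>>>>   CHKERRQ(ierr)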
>>>>
>>>> Can you explain that a little more? Maybe you can just push
>>>> forward what you want using the migrationSF.
>>>
>>> Hi Matt,
>>>
>>> Since I cannot figure out what is wrong or missing in my code, I
>>> followed an old ex26.c example in
>>> src/dm/impls/plex/examples/tests to create the similar code shown
>>> below to test global-to-natural ordering. The code may be ugly,
>>> with unnecessary functions in it. Using DMPlexGetVertexNumbering,
>>> I can get the values, but they are not in natural order; they are
>>> still in the default PETSc order, as if
>>> DMSetUseNatural(dm,PETSC_TRUE,ierr) had not been called.
>>>
>>> I do not understand what you are doing below. You just need to call
>>>
>>>   ierr = DMSetUseNatural(dm,PETSC_TRUE);CHKERRQ(ierr);
>>>   ierr = DMPlexDistribute(dm,0,&migrationSF,&pdm);CHKERRQ(ierr);
>>>   if (pdm) {
>>>     ierr = DMPlexSetMigrationSF(pdm,migrationSF);CHKERRQ(ierr);
>>>   }
>>> and the DMGlobalToNaturalBegin/End() should work.
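>>>
>>> In Fortran that sequence reads roughly as follows (a sketch; it
>>> assumes the auto-generated Fortran binding for DMPlexSetMigrationSF
>>> is available, and pdm /= PETSC_NULL_DM plays the role of the C null
>>> check):
>>>
>>>   call DMSetUseNatural(dm,PETSC_TRUE,ierr)
>>>   CHKERRQ(ierr)
>>>   call DMPlexDistribute(dm,0,migrationSF,pdm,ierr)
>>>   CHKERRQ(ierr)
>>>   if (pdm /= PETSC_NULL_DM) then
>>>     call DMPlexSetMigrationSF(pdm,migrationSF,ierr)
>>>     CHKERRQ(ierr)
>>>   end if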
>>
>> You mean to use DMPlexGlobalToNaturalBegin/End(), right? That's
>> what I tried at first, but without success.
>>
>> I will create a test example to check further whether I can
>> reproduce the problem.
>>
>> Thanks,
>>
>> Danyang
>>
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>   if (rank == 0) then
>>>     call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,num_cells, &
>>>                                   num_nodes,num_nodes_per_cell,    &
>>>                                   Petsc_False,dmplex_cells,ndim,   &
>>>                                   dmplex_verts,dm,ierr)
>>>     CHKERRQ(ierr)
>>>   else
>>>     call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,         &
>>>                                   0,num_nodes_per_cell,            &
>>>                                   Petsc_False,dmplex_cells,ndim,   &
>>>                                   dmplex_verts,dm,ierr)
>>>     CHKERRQ(ierr)
>>>   end if
>>>
>>>   if (nprocs > 1) then
>>>     call DMSetUseNatural(dm,PETSC_TRUE,ierr)
>>>     CHKERRQ(ierr)
>>>   end if
>>>
>>>   call DMPlexDistribute(dm,stencil_width,migrationsf, &
>>>                         distributedMesh,ierr)
>>>   CHKERRQ(ierr)
>>>
>>>   if (distributedMesh /= PETSC_NULL_DM) then
>>>
>>>     call PetscSFCreateInverseSF(migrationsf,migrationsf_inv,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call DMCreateGlobalToNatural(distributedMesh,migrationsf, &
>>>                                  migrationsf_inv,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call DMGetSection(distributedMesh,section,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call PetscSectionCreate(Petsc_Comm_World,section_seq,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call PetscSFDistributeSection(migrationsf_inv,section, &
>>>                                   PETSC_NULL_INTEGER,section_seq,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call DMPlexCreateGlobalToNaturalSF(distributedMesh, &
>>>                                        section_seq,migrationsf,sf_natural,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call DMSetUseNatural(distributedMesh,PETSC_TRUE,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call PetscSFDestroy(migrationsf,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>     call PetscSFDestroy(migrationsf_inv,ierr)
>>>     CHKERRQ(ierr)
>>>
>>>   end if
>>>
>>> Thanks,
>>>
>>> Danyang
>>>
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>> Regards,
>>>>
>>>> Danyang
>>>>
>>>> On 2018-12-03 5:22 a.m., Matthew Knepley wrote:
>>>>> I need to write a custom Fortran stub for this one. I
>>>>> will get it done as soon as possible.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>> On Sat, Dec 1, 2018 at 7:16 PM Danyang Su via petsc-users
>>>>> <petsc-users at mcs.anl.gov> wrote:
>>>>>
>>>>> Hi All,
>>>>>
>>>>> I got a simple compilation error when using
>>>>> PetscSFDistributeSection in Fortran. It looks like the required
>>>>> header files are included and the parameters are correctly
>>>>> defined. However, when compiling the code, I got the error
>>>>> "undefined reference to `petscsfdistributesection_'". The code is
>>>>> shown below. Did I miss anything here?
>>>>>
>>>>> #include <petsc/finclude/petscsys.h>
>>>>> #include <petsc/finclude/petscvec.h>
>>>>> #include <petsc/finclude/petscdm.h>
>>>>> #include <petsc/finclude/petscdmplex.h>
>>>>>       use petscsys
>>>>>       use petscvec
>>>>>       use petscdm
>>>>>       use petscdmplex
>>>>>
>>>>>       implicit none
>>>>>
>>>>>       PetscSection   :: section, section_seq
>>>>>       PetscSF        :: migrationsf_inv, sf_natural
>>>>>       Vec            :: vec_global, vec_natural
>>>>>       PetscErrorCode :: ierr
>>>>>
>>>>>       ...
>>>>>
>>>>>       call PetscSFDistributeSection(migrationsf_inv,section, &
>>>>>                                     PETSC_NULL_INTEGER,section_seq,ierr)
>>>>>       CHKERRQ(ierr)
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Danyang
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they
>>>>> begin their experiments is infinitely more interesting
>>>>> than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin
>>>> their experiments is infinitely more interesting than any
>>>> results to which their experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to
>>> which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/