[petsc-users] About DMDA (and extracting its ordering)

Thibaut Appel t.appel17 at imperial.ac.uk
Mon Feb 25 06:15:19 CST 2019


Hi Matthew,

The file in the link you sent calls neither DMGetLocalToGlobalMapping 
nor ISLocalToGlobalMappingGetIndicesF90, though?

Anyway, everything you wanted is there:

https://bitbucket.org/petsc/petsc/issues/262/fortran-unassociated-pointer-with

Thibaut


On 25/02/2019 12:03, Matthew Knepley wrote:
> On Mon, Feb 25, 2019 at 3:19 AM Appel, Thibaut <t.appel17 at imperial.ac.uk> wrote:
>
>     Hi Matthew,
>
>     Yes, I need F90, and the syntax in the file / yours
>>     PetscInt, pointer :: id_ltog(:)
>
>     is exactly the same as
>>     PetscInt, dimension(:), pointer :: id_ltog
>
>     They're both "correct" modern Fortran.
>
>     Anyway, I tried both and I still get the same error message.
>
>
> Okay, that file I sent the link to is tested every night, so I suspect 
> a compiler bug in your version.
> Please send your entire example so I can run it here. Also, let us 
> know again what Fortran compiler and version you are using.
>
>   Thanks,
>
>      Matt
>
>     Thibaut
>
>     On 24 Feb 2019, at 23:38, Matthew Knepley <knepley at gmail.com> wrote:
>
>>     On Sun, Feb 24, 2019 at 6:33 PM Appel, Thibaut <t.appel17 at imperial.ac.uk> wrote:
>>
>>         Is that for the id_ltog argument? I tried declaring it as IS,
>>         and you can't declare a deferred-shape array as a target. Same
>>         message. XXX(:) is syntactic sugar for dimension(:) :: XXX
>>
>>         I’m really confused because this
>>         https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex14f.F90.html
>>         doesn’t use the xxxF90 variants of the routines we’re talking
>>         about?
>>
>>
>>     You can use the F77 stuff if you want. I thought you needed F90. 
>>     For F90, I think you need to declare it
>>
>>       PetscInt, pointer :: id_ltog(:)
>>
>>     like it is in the link I sent you.
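>>
>>     A minimal sketch of that usage (assuming the DMDA da and the ierr
>>     from your snippet): the pointer goes in unassociated, PETSc
>>     associates it with its own storage, and Restore releases it.
>>
>>       PetscInt, pointer :: id_ltog(:)
>>       ISLocalToGlobalMapping :: ltog
>>
>>       call DMGetLocalToGlobalMapping(da,ltog,ierr)
>>       CHKERRA(ierr)
>>       ! do NOT allocate id_ltog yourself; pass it unassociated
>>       call ISLocalToGlobalMappingGetIndicesF90(ltog,id_ltog,ierr)
>>       CHKERRA(ierr)
>>       ! ... read id_ltog(1:size(id_ltog)) here ...
>>       call ISLocalToGlobalMappingRestoreIndicesF90(ltog,id_ltog,ierr)
>>       CHKERRA(ierr)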
>>
>>       Thanks,
>>
>>          Matt
>>
>>         Thibaut
>>
>>>         On 24 Feb 2019, at 22:50, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>         On Sun, Feb 24, 2019 at 4:28 PM Appel, Thibaut <t.appel17 at imperial.ac.uk> wrote:
>>>
>>>             Hi Matthew,
>>>
>>>             With the following data declaration and routine calls:
>>>
>>>
>>>         I am not a Fortran90 expert, but your declaration looks
>>>         different from the ones I have seen work:
>>>
>>>         https://bitbucket.org/petsc/petsc/src/master/src/dm/impls/plex/examples/tutorials/ex1f90.F90
>>>
>>>           Thanks,
>>>
>>>              Matt
>>>
>>>
>>>               DM :: da
>>>               ISLocalToGlobalMapping :: ltog
>>>               PetscInt, DIMENSION(:), POINTER :: id_ltog
>>>
>>>               CALL DMDACreate2D(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE, &
>>>                    DMDA_STENCIL_STAR,(nx+1),(ny+1),PETSC_DECIDE,PETSC_DECIDE, &
>>>                    neq,0,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL DMSetUp(da,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL DMDAGetCorners(da,icx,icy,PETSC_NULL_INTEGER,lx,ly,PETSC_NULL_INTEGER,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL ISLocalToGlobalMappingGetIndicesF90(ltog,id_ltog,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL ISLocalToGlobalMappingRestoreIndicesF90(ltog,id_ltog,ierr)
>>>               CHKERRA(ierr)
>>>
>>>               CALL DMDestroy(da,ierr)
>>>               CHKERRA(ierr)
>>>
>>>
>>>             I get, with the most recent Intel Fortran compiler and a
>>>             fresh "git pull" PETSc:
>>>
>>>             forrtl: severe (408): fort: (7): Attempt to use pointer ID_LTOG when it is not associated with a target
>>>
>>>
>>>             Thibaut
>>>
>>>>             On 22 Feb 2019, at 15:13, Matthew Knepley <knepley at gmail.com> wrote:
>>>>
>>>>             On Fri, Feb 22, 2019 at 9:10 AM Thibaut Appel via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>>>             I reckon that what corresponds best to my situation is
>>>>             to create my own MPIAIJ matrix, recover the DMDA
>>>>             local-to-global mapping, do the preallocation myself,
>>>>             and fill in my values with MatSetValues.
>>>>
>>>>             However, what is the correct way to
>>>>             call ISLocalToGlobalMappingGetIndices in Fortran? It
>>>>             seems there's something missing in the documentation as
>>>>             my code just won't work.
>>>>
>>>>             Some examples include a PetscOffset argument which is
>>>>             not present in the documentation, and there is no manual
>>>>             page for ISLocalToGlobalMappingGetIndicesF90: the only
>>>>             information I could find is on
>>>>             /src/vec/is/utils/f90-custom/zisltogf90.c. Are
>>>>             you supposed to allocate the index array yourself? I
>>>>             tried declaring it as a pointer too. I keep getting an
>>>>             error
>>>>
>>>>             1) There was a bug in that file, but I don't think it
>>>>             should have affected you. I will put in a PR.
>>>>
>>>>             2)  You are supposed to pass in a pointer, unallocated.
>>>>             We manage all the allocation, which is why you call
>>>>             Restore. What happens when you do that?
>>>>
>>>>               Thanks,
>>>>
>>>>                  Matt
>>>>             [0]PETSC ERROR: #1 User provided function() line 0 in User file
>>>>             application called MPI_Abort(MPI_COMM_SELF, -42924) - process 0
>>>>
>>>>             Minimal example:
>>>>
>>>>             PROGRAM test_dmda
>>>>
>>>>             #include <petsc/finclude/petscdm.h>
>>>>             #include <petsc/finclude/petscdmda.h>
>>>>
>>>>               USE PetscDMDA
>>>>
>>>>               IMPLICIT NONE
>>>>
>>>>               PetscErrorCode :: ierr
>>>>               PetscInt       :: irank, icx, icy, lx, ly, n_ltog
>>>>               DM             :: da
>>>>               ISLocalToGlobalMapping :: ltog
>>>>
>>>>               PetscInt, DIMENSION(:), ALLOCATABLE :: id_ltog
>>>>               !PetscInt, DIMENSION(:), POINTER :: id_ltog
>>>>
>>>>               PetscInt, PARAMETER :: nx=24, ny=9, neq=4
>>>>               CHARACTER(LEN=100) :: err_msg
>>>>
>>>>               CALL PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>>>
>>>>               CALL MPI_COMM_RANK(PETSC_COMM_WORLD,irank,ierr)
>>>>
>>>>               CALL DMDACreate2D(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE, &
>>>>                    DMDA_STENCIL_STAR,(nx+1),(ny+1),PETSC_DECIDE,PETSC_DECIDE, &
>>>>                    neq,0,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,da,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               CALL DMSetUp(da,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               CALL DMDAGetCorners(da,icx,icy,PETSC_NULL_INTEGER,lx,ly,PETSC_NULL_INTEGER,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               WRITE(*,'(*(1X,A,I3))') 'I am process #', irank, ' - my lower left corner is icx=', icx, &
>>>>                    'and icy=', icy, ' - the width is lx=', lx, ' and ly=', ly
>>>>
>>>>               CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               CALL ISLocalToGlobalMappingGetSize(ltog,n_ltog,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               ALLOCATE(id_ltog(1:n_ltog),STAT=ierr,ERRMSG=err_msg)
>>>>
>>>>               CALL ISLocalToGlobalMappingGetIndices(ltog,id_ltog,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               CALL ISLocalToGlobalMappingRestoreIndices(ltog,id_ltog,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               DEALLOCATE(id_ltog,STAT=ierr,ERRMSG=err_msg)
>>>>
>>>>               CALL DMDestroy(da,ierr)
>>>>               CHKERRA(ierr)
>>>>
>>>>               CALL PetscFinalize(ierr)
>>>>
>>>>             END PROGRAM test_dmda
>>>>
>>>>
>>>>
>>>>             On 21/02/2019 17:49, Matthew Knepley wrote:
>>>>>             On Thu, Feb 21, 2019 at 11:16 AM Thibaut Appel via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>>>>             Dear PETSc developers/users,
>>>>>
>>>>>             I’m solving linear PDEs on a regular grid with
>>>>>             high-order finite differences, assembling an MPIAIJ
>>>>>             matrix to solve linear systems or eigenvalue problems.
>>>>>             I've been using vertex-major, natural ordering for
>>>>>             the parallelism with PetscSplitOwnership (yielding
>>>>>             rectangular slices of the physical domain) and wanted
>>>>>             to move to DMDA to have a more square-ish domain
>>>>>             decomposition and minimize communication between
>>>>>             processes.
>>>>>
>>>>>             However, my application is memory-critical, and I have
>>>>>             finely-tuned matrix preallocation routines for
>>>>>             allocating memory "optimally". It seems the memory of
>>>>>             a DMDA matrix is allocated according to the stencil
>>>>>             width passed to DMDACreate, and the manual says about
>>>>>             it:
>>>>>
>>>>>             "These DMDA stencils have nothing directly to do with
>>>>>             any finite difference stencils one might choose to use
>>>>>             for a discretization"
>>>>>
>>>>>             Despite reading the manual pages, there must be
>>>>>             something I do not understand in the DM topology: what
>>>>>             is that "stencil width" for, then? I will not use ghost
>>>>>             values for my FD method, right?
>>>>>
>>>>>             What this is saying is, "You might be using
>>>>>             some stencil that is not STAR or BOX, but we are
>>>>>             preallocating according to one of those".
>>>>>             If you really care about how much memory
>>>>>             is preallocated, which it seems you do, then you might
>>>>>             be able to use
>>>>>
>>>>>             https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMDA/DMDASetBlockFills.html
>>>>>
>>>>>             to tell us exactly how to preallocate.
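>>>>>
>>>>>             A hedged sketch of how that call might look, assuming
>>>>>             the DMDA da from your snippet (the bs=2 and the 0/1
>>>>>             patterns are made up for illustration; dfill and ofill
>>>>>             are bs-by-bs fill patterns for the diagonal and
>>>>>             off-diagonal blocks of each grid point, set before
>>>>>             DMCreateMatrix()):
>>>>>
>>>>>               PetscInt, parameter :: bs = 2
>>>>>               PetscInt :: dfill(bs*bs), ofill(bs*bs)
>>>>>               ! 1 = preallocate that (row dof, col dof) entry, 0 = skip it
>>>>>               dfill = (/ 1, 1, &
>>>>>                          0, 1 /)  ! coupling within a grid point
>>>>>               ofill = (/ 1, 0, &
>>>>>                          0, 1 /)  ! coupling to neighbouring points
>>>>>               CALL DMDASetBlockFills(da,dfill,ofill,ierr)
>>>>>               CHKERRA(ierr)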
>>>>>
>>>>>             I was then wondering if I could just create an MPIAIJ
>>>>>             matrix and, with a PETSc routine, get the global
>>>>>             indices of the domain for each process: in other
>>>>>             words, an equivalent of PetscSplitOwnership that gives
>>>>>             me the DMDA unknown ordering, so that I can feed it
>>>>>             into my preallocation and assembly routines and loop
>>>>>             over it there.
>>>>>
>>>>>             You can make an MPIAIJ matrix yourself, of course.
>>>>>             It should have the same division of rows as the DMDA
>>>>>             division of dofs. Also, MatSetValuesStencil() will not
>>>>>             work for a custom matrix.
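>>>>>
>>>>>             A sketch of that approach, under the assumption that
>>>>>             lx and ly come from DMDAGetCorners(), neq is the
>>>>>             number of dof per point, and d_nnz/o_nnz are your own
>>>>>             preallocation arrays (hypothetical names):
>>>>>
>>>>>               Mat :: A
>>>>>               ISLocalToGlobalMapping :: ltog
>>>>>               PetscInt :: nloc
>>>>>               nloc = lx*ly*neq  ! rows owned by this process
>>>>>               CALL MatCreateAIJ(PETSC_COMM_WORLD,nloc,nloc, &
>>>>>                    PETSC_DETERMINE,PETSC_DETERMINE, &
>>>>>                    0,d_nnz,0,o_nnz,A,ierr)
>>>>>               CHKERRA(ierr)
>>>>>               ! reuse the DMDA numbering so MatSetValuesLocal()
>>>>>               ! accepts local indices
>>>>>               CALL DMGetLocalToGlobalMapping(da,ltog,ierr)
>>>>>               CHKERRA(ierr)
>>>>>               CALL MatSetLocalToGlobalMapping(A,ltog,ltog,ierr)
>>>>>               CHKERRA(ierr)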
>>>>>
>>>>>               Thanks,
>>>>>
>>>>>                  Matt
>>>>>
>>>>>             Thanks very much,
>>>>>
>>>>>             Thibaut
>>>>>
>>>>
>>>
>>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/