[petsc-users] PCSetCoordinates does not set coordinates of sub PC (fieldsplit) objects

Matthew Knepley knepley at gmail.com
Thu Jan 13 12:20:50 CST 2022


On Thu, Jan 13, 2022 at 1:15 PM Nicolás Barnafi <nabw91 at gmail.com> wrote:

> Dear all,
>
> I have created a first implementation. For now it must be called after
> setting the fields; eventually I would like to move it to the setup phase.
> The implementation seems clean, but it is giving me some memory errors
> (free(): corrupted unsorted chunks).
>
> You may find the code below. After some work with gdb, I found out that
> the error appears when calling the ISDestroy(&is_coords) line, which is
> not clear to me, as I create and then destroy the is_coords object
> entirely within the while loop. I would greatly appreciate a hint on what
> the problem is.
>
> After debugging this, and working on your suggestion, I will open a PR.
>
> Best regards,
> NB
>
>
> ----- CODE ------
>
> static PetscErrorCode PCSetCoordinates_FieldSplit(PC pc, PetscInt dim, PetscInt nloc, PetscReal coords[])
> {
>   PetscErrorCode    ierr;
>   PC_FieldSplit     *jac = (PC_FieldSplit*)pc->data;
>   PC_FieldSplitLink ilink_current = jac->head;
>   PC                pc_current;
>   PetscInt          nmin, nmax, ii, ndofs;
>   PetscInt          *owned_dofs;   // Indices owned by this processor
>   PetscReal         *coords_block; // Coordinates to be given to the current PC
>   IS                is_owned;
>
>   PetscFunctionBegin;
>   // Extract the matrix ownership range to then compute subindexes for the
>   // coordinates. This results in an IS object (is_owned).
>   // TODO: This would be simpler with a general MatGetOwnershipIS
>   // (currently supported only by Elemental and BLAS matrices).
>   ierr = MatGetOwnershipRange(pc->mat,&nmin,&nmax);CHKERRQ(ierr);
>   ndofs = nmax - nmin;
>   ierr = PetscMalloc1(ndofs, &owned_dofs);CHKERRQ(ierr);
>   for(PetscInt i=0;i<ndofs;++i) // loop starts at 0, not nmin
>     owned_dofs[i] = nmin + i;
>   ierr = ISCreateGeneral(MPI_COMM_WORLD, ndofs, owned_dofs, PETSC_OWN_POINTER, &is_owned);CHKERRQ(ierr);
>
>

Here you tell PETSc to take ownership of the memory for owned_dofs, but below
you PetscFree(owned_dofs). These are not compatible: with PETSC_OWN_POINTER
the IS frees the array itself, so freeing it again corrupts the heap. You
should just destroy the IS when you are done.
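
For illustration, a minimal sketch of the two compatible patterns (untested,
reusing your variable names):

  /* Option 1: the IS takes ownership, so never PetscFree(owned_dofs) yourself */
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, ndofs, owned_dofs, PETSC_OWN_POINTER, &is_owned);CHKERRQ(ierr);
  /* ... use is_owned ... */
  ierr = ISDestroy(&is_owned);CHKERRQ(ierr); /* this also frees owned_dofs */

  /* Option 2: the IS copies the values, so you keep ownership and free the array */
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, ndofs, owned_dofs, PETSC_COPY_VALUES, &is_owned);CHKERRQ(ierr);
  /* ... use is_owned ... */
  ierr = ISDestroy(&is_owned);CHKERRQ(ierr);
  ierr = PetscFree(owned_dofs);CHKERRQ(ierr);

Also note that the owned indices form a contiguous range, so
ISCreateStride(comm, ndofs, nmin, 1, &is_owned) would avoid the temporary
array entirely.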

  Thanks,

     Matt


>   // For each IS, embed it to get local coordinate indices and then set
>   // coordinates on the subPC.
>   ii=0;
>   while(ilink_current)
>   {
>     IS is_coords;
>     PetscInt ndofs_block;
>     const PetscInt *block_dofs_enumeration; // Numbering of the dofs relevant to the current block
>
>     // Setting drop to PETSC_TRUE, although it should make no difference.
>     ierr = ISEmbed(ilink_current->is, is_owned, PETSC_TRUE, &is_coords);CHKERRQ(ierr);
>     ierr = ISGetLocalSize(is_coords, &ndofs_block);CHKERRQ(ierr); // get the size before allocating
>     ierr = PetscMalloc1(dim*ndofs_block, &coords_block);CHKERRQ(ierr);
>     ierr = ISGetIndices(is_coords, &block_dofs_enumeration);CHKERRQ(ierr);
>
>     // Having the indices computed and the memory allocated, we can copy
>     // the relevant coords and set them on the subPC.
>     for(PetscInt dof=0;dof<ndofs_block;++dof)
>       for(PetscInt d=0;d<dim;++d)
>       {
>         coords_block[dim*dof + d] = coords[dim*block_dofs_enumeration[dof] + d];
>         // printf("Dof: %d, Global: %f\n", block_dofs_enumeration[dof], coords[dim*block_dofs_enumeration[dof] + d]);
>       }
>     ierr = ISRestoreIndices(is_coords, &block_dofs_enumeration);CHKERRQ(ierr);
>     ierr = ISDestroy(&is_coords);CHKERRQ(ierr);
>     ierr = KSPGetPC(ilink_current->ksp, &pc_current);CHKERRQ(ierr);
>     if(!pc_current) // check the subPC before using it
>       SETERRQ(PetscObjectComm((PetscObject)pc),PETSC_ERR_ORDER,"Setting coordinates to PCFIELDSPLIT but a subPC is null.");
>     ierr = PCSetCoordinates(pc_current, dim, ndofs_block, coords_block);CHKERRQ(ierr);
>     ierr = PetscFree(coords_block);CHKERRQ(ierr);
>
>     ilink_current = ilink_current->next;
>     ++ii;
>   }
>   ierr = PetscFree(owned_dofs);CHKERRQ(ierr);
>   PetscFunctionReturn(0);
> }
>
> On Wed, Jan 12, 2022 at 6:22 AM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>
>> On Jan 11, 2022, at 9:51 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>> On Tue, Jan 11, 2022 at 3:31 PM Barry Smith <bsmith at petsc.dev> wrote:
>>
>>>
>>>   Nicolas,
>>>
>>>     For "simple" PCFIELDSPLIT it is possible to pass down the attached
>>> coordinate information. By simple I mean where the splitting is done by
>>> fields and not by general lists of IS (where one does not have enough
>>> information to know what the coordinates would mean to the subPCS).
>>>
>>>    Look in fieldsplit.c PCFieldSplitSetFields_FieldSplit() where it does
>>> the KSPCreate(). I think you can do a KSPGetPC() on that ksp and
>>> PCSetCoordinates on that PC to supply the coordinates to the subPC. In the
>>> function PCFieldSplitSetIS_FieldSplit() you can also attach the coordinates
>>> to the subPCs IF defaultsplit is true.
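>>>
>>>    Roughly, the calls at that spot would look like the following (just a
>>> sketch; ilink, nloc_sub, and subcoords are illustrative names, not the
>>> actual variables in fieldsplit.c):
>>>
>>>   PC subpc;
>>>   ierr = KSPGetPC(ilink->ksp, &subpc);CHKERRQ(ierr);
>>>   ierr = PCSetCoordinates(subpc, dim, nloc_sub, subcoords);CHKERRQ(ierr);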
>>>
>>>    Sadly this is not the full story. The outer PC will not have any
>>> coordinates, because calling PCSetCoordinates on a PCFIELDSPLIT does
>>> nothing since fieldsplit doesn't handle coordinates. So you need to do
>>> more: provide a PCSetCoordinates_FieldSplit() that saves the coordinates
>>> in new entries in the PC_FieldSplit struct, and then in
>>> PCFieldSplitSetFields_FieldSplit() access those saved values and pass them
>>> into the PCSetCoordinates() that you call on the subPCs. Once you write
>>> PCSetCoordinates_FieldSplit(), you need to call
>>>
>>>   ierr = PetscObjectComposeFunction((PetscObject)pc,"PCSetCoordinates_C",PCSetCoordinates_FieldSplit);CHKERRQ(ierr);
>>>
>>> inside PCCreate_FieldSplit().
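>>>
>>>    A rough sketch of that function (the entries dim, nloc, and coords are
>>> hypothetical additions to the PC_FieldSplit struct, as described above):
>>>
>>>   static PetscErrorCode PCSetCoordinates_FieldSplit(PC pc, PetscInt dim, PetscInt nloc, PetscReal coords[])
>>>   {
>>>     PC_FieldSplit  *jac = (PC_FieldSplit*)pc->data;
>>>     PetscErrorCode ierr;
>>>
>>>     PetscFunctionBegin;
>>>     jac->dim  = dim;  /* hypothetical new struct entry */
>>>     jac->nloc = nloc; /* hypothetical new struct entry */
>>>     ierr = PetscFree(jac->coords);CHKERRQ(ierr); /* drop any previously set coordinates */
>>>     ierr = PetscMalloc1(dim*nloc, &jac->coords);CHKERRQ(ierr);
>>>     ierr = PetscArraycpy(jac->coords, coords, dim*nloc);CHKERRQ(ierr);
>>>     PetscFunctionReturn(0);
>>>   }
>>>
>>> PCFieldSplitSetFields_FieldSplit() can then pass jac->coords (restricted
>>> to each split) into the PCSetCoordinates() calls on the subPCs, and
>>> PCDestroy_FieldSplit() must PetscFree(jac->coords).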
>>>
>>>    Any questions just let us know.
>>>
>>
>> I will add "Why is this so cumbersome?". This is a workaround in order to
>> get geometric information into GAMG. It should really be
>> PCGAMGSetCoordinates(), since the coordinates are used to calculate the
>> rigid body modes, and the code assumes a bunch of things about the
>> coordinate space. This would not help you, because it would still force
>> you to pull out the correct subPC. The "right" way now to give geometric
>> information to a TS/SNES/KSP/PC is through a DM, which is passed down
>> through PCFIELDSPLIT, PCPATCH, etc. However, a DM is heavier weight than
>> just some coordinates.
>>
>>
>>   This is not cumbersome at all. It is a simple, natural way to pass
>> around coordinates to PCs and, when possible, their children.
>>
>>   Barry
>>
>>   Note that we could also have a DMGetCoordinates() that pulled
>> coordinates, in this form, from a DM (that happened to have them)
>> associated with the PC, and the PC could call it to get the coordinates
>> and use them as needed. But this simple PCSetCoordinates() is a nice
>> complement to that approach.
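>>
>>   For instance, a PC that has a DM attached could pull them like this (a
>> sketch, assuming a real-valued build so the PetscScalar coordinate array
>> can be passed as PetscReal):
>>
>>   DM                 dm;
>>   Vec                xcoords;
>>   const PetscScalar *a;
>>   PetscInt           dim, n;
>>
>>   ierr = PCGetDM(pc, &dm);CHKERRQ(ierr);
>>   ierr = DMGetCoordinates(dm, &xcoords);CHKERRQ(ierr);
>>   ierr = DMGetCoordinateDim(dm, &dim);CHKERRQ(ierr);
>>   ierr = VecGetLocalSize(xcoords, &n);CHKERRQ(ierr);
>>   ierr = VecGetArrayRead(xcoords, &a);CHKERRQ(ierr);
>>   ierr = PCSetCoordinates(pc, dim, n/dim, (PetscReal *)a);CHKERRQ(ierr);
>>   ierr = VecRestoreArrayRead(xcoords, &a);CHKERRQ(ierr);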
>>
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>>   Barry
>>>
>>>
>>> > On Jan 11, 2022, at 11:58 AM, Nicolás Barnafi <nabw91 at gmail.com>
>>> wrote:
>>> >
>>> > Dear community,
>>> >
>>> > I am working on a block preconditioner where one of the blocks uses
>>> > HYPRE's AMS. As AMS requires the coordinates of the dofs, I have set
>>> > them on the PC object. I expected the coordinates to be inherited down
>>> > to the subblocks; is this not the case? (It seems not, as I couldn't
>>> > find a specialized FIELDSPLIT SetCoordinates function.)
>>> >
>>> > If this feature is missing, please give me some hints on where to add
>>> > the missing function, and I would gladly do it. If not, please let me
>>> > know why it was dismissed, so that I can do things the hard way [as in
>>> > hard-coded ;)].
>>> >
>>> > Kind regards,
>>> > Nicolas
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>
>>
>
> --
> Nicolás Alejandro Barnafi Wittwer
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/