[petsc-dev] Problems with PCReset_FieldSplit

Dmitry Karpeyev karpeev at mcs.anl.gov
Sun Jun 16 14:02:23 CDT 2013


Barry,
I see your point.

To fix PCReset_FieldSplit() correctly, however, the question seems to
be how to error out gracefully when the splits hollowed out by
PCReset() are not subsequently refilled correctly using named fields.
It's relatively easy to throw an error when PCFieldSplitSetIS()
attempts to set a previously unknown named field -- that would
require PCFieldSplitSetIS() to search through the list of splits,
however.
But what about the links left unfilled (perhaps the DM serving the
splits via DMCreateFieldDecomposition() was reset and now serves a
subset of the previously-used splits)?
Should PCSetUp_FieldSplit() traverse the list and verify the links are
all set up?
Is this a patch to 3.4 or should it go to master?  It seems that
nobody encounters this problem
currently, so maybe there is no urgency to fix this in 3.4.x.

Dmitry.

On Sat, Jun 15, 2013 at 5:56 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>   Dmitry,
>
>     The reuse intended for all the XXXReset() routines is not the time spent recreating the objects (which is trivial); it is all the optional parameters etc. that have previously been set that we want preserved for the next use. That is, Reset is supposed to preserve the exact solver configuration while allowing new-size matrices and vectors.
>
> On Jun 15, 2013, at 2:34 PM, Dmitry Karpeyev <karpeev at mcs.anl.gov> wrote:
>
>> It seems to me that PCReset_FieldSplit is buggy: it doesn't interact
>> correctly with named splits, particularly those set via
>> DMCreateFieldDecomposition() and the like.
>> Does anyone use PCFieldSplit with PCReset()?  It fails for me on
>> problems where splits are named and change (potentially even
>> appearing/disappearing) midway through the SNES iteration.
>
>      Reset is not set up for changing the number of splits etc.; that is not its purpose. Presumably in that case one should just destroy the PCFieldSplit and rebuild it, since it is not reusing solver options anyway.
>>
>> The reason seems to be this: PCReset_FieldSplit() will just about gut
>> the splits, leaving only the links themselves and the KSPs inside
>> them (after KSPReset) hanging out there. Presumably, this is to save
>> the time spent allocating the links and creating the KSPs.
>>
>> In the olden days the links were populated sequentially, based on the
>> presumed (and immutable across PCReset) number of splits -- they were
>> matched up with strided ISs or DMComposite components based on their
>> implied position in the list.
>> When adding named splits, however, PCFieldSplitSetIS() will tack them
>> onto the end of the linked list, even though splits with those names
>> may already be out there (gutted by PCReset).
>
>      This seems to be wrong: it should search through the named splits and either generate an error or reuse the split with the same name.
>
>> Unless I'm missing
>> something, this is a problem (as confirmed by my code bombing).
>>
>> One solution would be to make PCFieldSplitSetIS() search the linked
>> list for a split with the given name, and repopulate it, if found.
>> This doesn't solve the problem of a changing number of splits,
>> however.  A simpler, and more general, solution is to simply remove
>> the links altogether, reset the number of splits to 0, issetup =
>> PETSC_FALSE, etc.  That, of course, would eliminate the reuse that's
>> now being attempted, but I'm not sure how much we are really saving
>> there.  If it's okay to use this more radical approach, the fix is in
>> karpeev/fix-pc-fieldsplit-reset.  It's based on maint, and it would be
>> good for it to eventually end up as a patch to petsc-3.4.
>>
>> If there are no objections, I'd like to have the fix branch merged into next.
>
>    If you are changing the solver, which you are by removing splits, then you need to destroy the PC and create a new one, not just use PCReset() on it.  It doesn't sound like your current karpeev/fix-pc-fieldsplit-reset branch is the right approach to the problem.
>
>
>
>    Barry
>
>> Thanks.
>> Dmitry.
>


