[petsc-dev] fieldsplit + composite + ksp

Pierre Jolivet pierre.jolivet at enseeiht.fr
Wed Sep 18 09:13:45 CDT 2019



> On 18 Sep 2019, at 4:04 PM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Wed, Sep 18, 2019 at 10:01 AM Pierre Jolivet <pierre.jolivet at enseeiht.fr> wrote:
> 
> 
>> On 18 Sep 2019, at 3:48 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> 
>> On Wed, Sep 18, 2019 at 2:49 AM Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
>> Hello,
>> I’m solving the following dummy system http://jolivet.perso.enseeiht.fr/composite_ksp.tar.gz
>> [A, B; C, D], with a PCFIELDSPLIT. For the PC of D, I’m using a PCCOMPOSITE with two sub-PCs, one of which is a PCKSP.
>> Could you please help me figure out what is wrong in the following piece of code, which can be launched with the following arguments:
>> $ mpirun -n 1 ./a.out -ksp_type preonly -pc_type fieldsplit -fieldsplit_1_pc_type composite -fieldsplit_1_sub_1_pc_type ksp -fieldsplit_1_sub_1_ksp_ksp_type gmres -fieldsplit_1_sub_1_ksp_pc_type gamg -fieldsplit_1_sub_1_ksp_ksp_converged_reason -fieldsplit_1_sub_1_ksp_pc_gamg_sym_graph 1 -fieldsplit_1_sub_1_ksp_pc_gamg_square_graph 10 -fieldsplit_1_sub_1_ksp_ksp_rtol 1e-8
>> 
>> It solves the dummy system twice, with a varying block D.
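
For reference, the option prefixes nest as follows: fieldsplit_1_ addresses the second split (the D block), sub_1_ the second sub-PC of the PCCOMPOSITE, and the extra ksp_ segment is appended by PCKSP for its inner solver. Below is a minimal sketch, not code from the tarball, of setting the same options programmatically with PetscOptionsSetValue() before KSPSetFromOptions() is called:

/* Sketch only: the command-line options quoted above, set from the program. */
PetscOptionsSetValue(NULL, "-ksp_type", "preonly");
PetscOptionsSetValue(NULL, "-pc_type", "fieldsplit");
PetscOptionsSetValue(NULL, "-fieldsplit_1_pc_type", "composite");
PetscOptionsSetValue(NULL, "-fieldsplit_1_sub_1_pc_type", "ksp");
PetscOptionsSetValue(NULL, "-fieldsplit_1_sub_1_ksp_ksp_type", "gmres");
PetscOptionsSetValue(NULL, "-fieldsplit_1_sub_1_ksp_pc_type", "gamg");
PetscOptionsSetValue(NULL, "-fieldsplit_1_sub_1_ksp_ksp_rtol", "1e-8");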
>> 
>> It's not the PC, it's the matrix. Everything in the PC gets re-set up just like you want.
>> 
>> I did MatEqual(S2_1, S2_001, &equal) and equal was false.
> 
> They are not supposed to be equal, so that’s a good thing.
> Or are you doing the comparison _after_ the MatCopy?
> 
> Yes.
>  
> I’m not sure what your point is, sorry.
> 
> I think they might not actually have identical nonzero structure.

OK, that was easier than expected.
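
For the record, a minimal sketch of the pitfall this points to, assuming the reproducer refreshes one matrix from the other with MatCopy (S2_1 and S2_001 are the matrices named above; the copy direction and the MatStructure argument are assumptions, not taken from the tarball):

PetscBool equal;
MatEqual(S2_1, S2_001, &equal);
/* equal == PETSC_FALSE: the two matrices differ, possibly only in their
   nonzero structure. MatCopy() with SAME_NONZERO_PATTERN requires both
   matrices to have the same structure; with a mismatch, the copy and the
   PCSetUp that follows it can fail (cf. DIVERGED_PC_FAILED / SUBPC_ERROR). */
MatCopy(S2_001, S2_1, DIFFERENT_NONZERO_PATTERN);

Re-creating the destination matrix (e.g. with MatDuplicate) instead of copying into an existing one would also sidestep the pattern mismatch.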

Thanks!
Pierre

>   Thanks,
> 
>      Matt
>  
> Thanks,
> Pierre
> 
>>   Thanks,
>> 
>>      Matt
>>  
>> It should give you:
>>       Linear fieldsplit_1_sub_1_ksp_ solve converged due to CONVERGED_RTOL iterations 8
>> solve #0: 16098.3
>>       Linear fieldsplit_1_sub_1_ksp_ solve did not converge due to DIVERGED_PC_FAILED iterations 0
>>                      PC_FAILED due to SUBPC_ERROR
>> solve #1: inf
>> 
>> If I switch line 70 to #if 0, I get the expected output:
>>       Linear fieldsplit_1_sub_1_ksp_ solve converged due to CONVERGED_RTOL iterations 8
>> solve #0: 16098.3
>>       Linear fieldsplit_1_sub_1_ksp_ solve converged due to CONVERGED_RTOL iterations 8
>> solve #1: 325.448
>> 
>> I’m realizing that this probably has nothing to do with the outer PCFIELDSPLIT, but it comes from a rather large FSI solver, so reproducing this behavior in “only” 97 SLOC is good enough for you, I hope.
>> 
>> Thanks in advance,
>> Pierre
>> 
>> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/