[petsc-users] Fieldsplit with redistribute

Barry Smith bsmith at petsc.dev
Wed Apr 26 10:21:02 CDT 2023


  Perhaps there is a misunderstanding. With the code I added, you always just provide the IS for the original problem, without any concern for which rows will be zeroed and which rows will be redistributed. PCREDISTRIBUTE manages "fixing" things.

   I want you to use PCFIELDSPLIT and not PCREDISTRIBUTE at all (so -pc_type fieldsplit) with your IS and see if that works correctly sequentially and in parallel.

   Then I want you to use -pc_type redistribute -redistribute_pc_type fieldsplit with your IS sequentially and in parallel (note that you do not change your code, or even recompile it, for any of these cases).
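   A minimal sketch of the unchanged code used in both runs (hypothetical
variable names for your KSP, PC, field ISs, and vectors; in both cases the
ISs are numbered for the original, un-reduced matrix):

      PetscCall(KSPSetFromOptions(ksp));   /* picks up -pc_type ... */
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCFieldSplitSetIS(pc, "0", is_field0));
      PetscCall(PCFieldSplitSetIS(pc, "1", is_field1));
      PetscCall(KSPSolve(ksp, b, x));

   Run 1:  -pc_type fieldsplit
   Run 2:  -pc_type redistribute -redistribute_pc_type fieldsplit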

  Barry




> On Apr 26, 2023, at 10:58 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
> 
> Because without redistribute, PCFIELDSPLIT expects numbers pointing to rows in the “big” matrix, whereas when
> I construct my IS for redistribute it will have indices pointing to rows in the “reduced” matrix. Coming back to my small
> example below, I construct an IS with 0 2 4 as expected (?) for the reduced matrix.

> If I pass 0 2 4 to the big matrix I expect the wrong result. If I skip the part in the construction of my IS where
> 1 3 7 gets converted to 0 2 4 it works without redistribute.
>  
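> A sketch of that renumbering (hypothetical data): each surviving index is
> shifted down by the number of removed rows preceding it, so 1 3 7 become
> 0 2 4 if, say, rows 0, 4 and 6 are eliminated:
> 
>   PetscInt field[]   = {1, 3, 7}; /* field rows in the big matrix    */
>   PetscInt removed[] = {0, 4, 6}; /* rows eliminated by redistribute */
>   for (PetscInt i = 0; i < 3; i++) {
>     PetscInt shift = 0;
>     for (PetscInt j = 0; j < 3; j++)
>       if (removed[j] < field[i]) shift++;
>     field[i] -= shift; /* field[] is now {0, 2, 4} */
>   }
> 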
> I’m using MatZeroRowsColumns in all cases, yes.
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Wednesday, April 26, 2023 4:47 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
> 
> 
> On Apr 26, 2023, at 10:32 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> On 1 core there are no errors, but the solution to the linear system is wrong, as expected.
>  
>    Why is it expected to be wrong? Are you still using MatZeroRowsColumns()? You should.
> 
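>    (For reference, a sketch of the call in question, with hypothetical
>    names: zero the rows AND columns of the locked dofs, put 1.0 on the
>    diagonal, and adjust the right-hand side b using the prescribed values
>    in x.)
> 
>       PetscCall(MatZeroRowsColumnsIS(A, is_locked, 1.0, x, b));
> 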
> 
> On 2 cores I get this:
>  
> [0]ISdata: min=  139, max= 6292, freeudofs= 4459. min=    0, max= 6292, freedofs= 6293:  row 2629
> [1]ISdata: min= 6301, max=10639, freeudofs= 3244. min= 6293, max=10639, freedofs= 4347:  row 2629
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: Index 0's value 6301 is smaller than minimum given 19845
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.19.0-236-gee39b84cc03  GIT Date: 2023-04-23 18:43:23 -0400
> [1]PETSC ERROR: topopt on a arch-linux-c-debug named win01705 by carlthore Wed Apr 26 16:26:24 2023
> [1]PETSC ERROR: Configure options -f --with-cuda --with-cusp --download-scalapack --download-hdf5 --download-zlib --download-mumps --download-parmetis --download-metis --download-ptscotch --download-hypre --download-spai
> [1]PETSC ERROR: #1 ISComplement() at /mnt/c/mathware/petsc/src/vec/is/is/utils/iscoloring.c:803
> [1]PETSC ERROR: #2 PCFieldSplitSetDefaults() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:544
> [1]PETSC ERROR: #3 PCSetUp_FieldSplit() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:587
> [1]PETSC ERROR: #4 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994
> [1]PETSC ERROR: #5 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406
> [1]PETSC ERROR: #6 SetUpSolver() at /mnt/c/TOPet/fts_topopt_in_petsc-master/MixedStokes.cc:2650
>  
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Wednesday, April 26, 2023 4:11 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>   What happens if you pass your IS directly to PCFIELDSPLIT, without using PCREDISTRIBUTE?
> 
> 
> 
> On Apr 26, 2023, at 2:27 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> Hi again,
>  
> I now think I got my IS in order (it’s just one IS because, unlike in your ex84.c, I don’t provide the complement of the IS explicitly but let fieldsplit compute it; ex84.c works fine if I do the same there).
> As before, my code works with PCREDISTRIBUTE and PCFIELDSPLIT on 1 core. I then try
> with 2 cores. First I check the IS in MATLAB, and it looks fine as far as I can tell, with identical content
> as in the 1-core case, as it should (?). Then I try running the code, but it fails with (the first two lines are mine):
>  
> [1]ISdata: min= 6301, max=10639, freeudofs= 3244. min= 6293, max=10639, freedofs= 4347:  row 2629
> [0]ISdata: min=  139, max= 6292, freeudofs= 4459. min=    0, max= 6292, freedofs= 6293:  row 2629
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Index 3748's value 5320 is larger than maximum given 5320
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.19.0-236-gee39b84cc03  GIT Date: 2023-04-23 18:43:23 -0400
> [0]PETSC ERROR: topopt on a arch-linux-c-debug named win01705 by carlthore Wed Apr 26 08:01:49 2023
> [0]PETSC ERROR: Configure options -f --with-cuda --with-cusp --download-scalapack --download-hdf5 --download-zlib --download-mumps --download-parmetis --download-metis --download-ptscotch --download-hypre --download-spai
> [0]PETSC ERROR: #1 ISComplement() at /mnt/c/mathware/petsc/src/vec/is/is/utils/iscoloring.c:804
> [0]PETSC ERROR: #2 PCFieldSplitSetDefaults() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:544
> [0]PETSC ERROR: #3 PCSetUp_FieldSplit() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:587
> [0]PETSC ERROR: #4 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994
> [0]PETSC ERROR: #5 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406
> [0]PETSC ERROR: #6 PCSetUp_Redistribute() at /mnt/c/mathware/petsc/src/ksp/pc/impls/redistribute/redistribute.c:327
> [0]PETSC ERROR: #7 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994
> [0]PETSC ERROR: #8 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406
> [0]PETSC ERROR: #9 SetUpSolver() at /mnt/c/TOPet/fts_topopt_in_petsc-master/MixedStokes.cc:2650
> 
> These are all the errors I see; I’m not sure why the error message is not written out in full. I guess the error is on
> my side, with the IS still not being constructed correctly, but what do you think?
>  
> Kind regards,
> Carl-Johan
>  
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Monday, April 24, 2023 8:27 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>   PCREDISTRIBUTE looks like it is not yet GPU friendly. This needs to be fixed, but it should be a separate fix and MR from my current one. 
>  
>   Please just check whether PCREDISTRIBUTE followed by PCFIELDSPLIT works on CPUs for your code.
>  
>   Barry
>  
> 
> 
> 
> 
> On Apr 24, 2023, at 2:23 PM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> I wasn’t sure if I was going to bother you again with this, but since it looks like you plan to merge this into the
> main branch (?), I thought it might be interesting to know that I’ve tried this with my code running with CUDA but got the
> attached error. I suspect it’s related to the RHS red->b, but I’m not sure. If I switch off redistribute, my code runs fine
> with CUDA.
>  
> Kind regards,
> Carl-Johan
>  
>  
>  
> From: Carl-Johan Thore 
> Sent: Monday, April 24, 2023 5:08 PM
> To: Barry Smith <bsmith at petsc.dev>
> Subject: RE: [petsc-users] Fieldsplit with redistribute
>  
> Ok, that worked great with my code on 1 core! (I haven’t been able to try the multi-core case yet due to issues with my own code
> mentioned below)
>  
> I’m not sure if you forgot to remove the freeing of the map object
> outside or if I messed up with the pull somehow, but I had to comment out that line manually:
>  
> [image attachment scrubbed]
>  
> /Carl-Johan
>  
>  
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Monday, April 24, 2023 4:26 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>   The bug was mine; I was freeing the map object outside of the if () instead of inside. You can do
>  
>    git pull
>    make all
>  
>    and then try again.
>  
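>    Schematically, the kind of fix described above (hypothetical names,
>    not the actual redistribute.c source):
> 
>       if (use_map) {
>         /* the map object is created and used only in this branch */
>         PetscCall(ISLocalToGlobalMappingDestroy(&map)); /* now inside */
>       }
>       /* previously the destroy sat here, outside the if, so it also
>          ran when the map had never been created */
> 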
>    Barry
>  
>  
> 
> On Apr 24, 2023, at 5:39 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> Hi Barry!
>  
> First of all, thank you very very much for this! I was expecting maybe a few hints and pointers on how to proceed
> with my work, but then you did a complete implementation … 
>  
> Your code ran fine with ex84.cc. Unfortunately it crashed when running my main code (a mixed Stokes solver).
> When running on 1 core I get a crash which is maybe related to your code, so I’ve attached the error message for that
> case. However, on multiple cores I think the issue is mainly that I’m not constructing the original IS correctly, so I’ll
> look into that myself.
>  
> Regarding reporting to https://gitlab.com/petsc/petsc/-/merge_requests/6366, should it be done here?:
> [image attachment scrubbed]
>  
> By the way, I managed yesterday to make a working implementation of my own example and was planning to send it
> after cleaning it up and maybe optimizing it a bit. I’ve attached it if you’re curious (or just want to have a good laugh :))
>  
> Kind regards,
> Carl-Johan
>  
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Monday, April 24, 2023 12:49 AM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>    I have added support for PCREDISTRIBUTE to propagate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can
> access it with
>  
>    git fetch
>    git checkout barry/2023-04-22/fieldsplit-fields-propogate
>    ./configure
>    make all check
>  
>    Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file.
>  
>    Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or if you have any difficulties.
>  
>   Barry
>  
>  
>  
> 
> On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> Great, thanks! I’ve attached the code, a makefile, and a 1-page PowerPoint which hopefully explains
> what I’m trying to do with this little toy problem. There is obviously (?) something I need to add around
> line 327 in the code in order to move the indices to the correct rank.
>  
> Output should be something like this when running:
>  
> [image attachment scrubbed]
>  
> Let me know if you need any more info, or if the code is incomprehensible
> (it’s long because I’ve copied a lot from redistribute.c).
>  
> Kind regards,
> Carl-Johan
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Thursday, April 20, 2023 3:17 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>   Sure
>  
> 
> On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> Hi Barry,
>  
> In the conversation below you mentioned that I could send code to you to take a look. I’ve written
> up what I think is a minimally working example for this. It’s almost there in the sense of distributing
> the correct number of indices to the ranks to match the reduced matrix, but they are the wrong indices.
> Would it be okay if I sent you the code to have a look?
>  
> Kind regards,
> Carl-Johan 
>  
> From: Barry Smith <bsmith at petsc.dev>
> Sent: Sunday, April 16, 2023 10:31 PM
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
>  
>    The manual page for ISEmbed is incomprehensible to me. Anyway, no matter what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE in order to produce the reduced IS, which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), not PCApply_Redistribute().)
>  
>   Barry
>  
>  
> 
> On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore <carl-johan.thore at liu.se> wrote:
>  
> Thanks for the quick reply, Barry!
> I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank but fails on multiple ranks. I suspect the issue is with the use of ISEmbed since, quoting the PETSc manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute().
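> (For context, the ISEmbed call I have looks roughly like this, with
> hypothetical names: it locates the entries of the field IS among the kept
> dofs, dropping entries not found there.)
> 
>   IS is_reduced;
>   PetscCall(ISEmbed(is_field, is_kept, PETSC_TRUE, &is_reduced));
> 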
> From: Barry Smith <bsmith at petsc.dev>
> Sent: 16 April 2023 21:11:18
> To: Carl-Johan Thore <carl-johan.thore at liu.se>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Fieldsplit with redistribute 
>  
> 
>    There is no code to do this currently. 
> 
>     I would start by building your IS for each split before the PCREDISTRIBUTE and then adding code to PCApply_Redistribute() that "fixes" these IS by removing the entries of the IS associated with removed degrees of freedom and then shifting the remaining indices of the IS to take into account the removed indices. But you have probably already been trying this? It does require digging directly into PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector is moved.
> 
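>    A sketch of that fix (IsRemoved() and NumRemovedBefore() are
> hypothetical helpers that would be built from the removed-dof information
> inside the redistribute code; the MPI move of entries between ranks is a
> separate step not shown):
> 
>       PetscInt        n, m = 0, *newidx;
>       const PetscInt *idx;
>       PetscCall(ISGetLocalSize(is_split, &n));
>       PetscCall(ISGetIndices(is_split, &idx));
>       PetscCall(PetscMalloc1(n, &newidx));
>       for (PetscInt i = 0; i < n; i++) {
>         /* drop removed dofs; shift survivors down past removed ones */
>         if (!IsRemoved(idx[i])) newidx[m++] = idx[i] - NumRemovedBefore(idx[i]);
>       }
>       PetscCall(ISRestoreIndices(is_split, &idx));
>       PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, m, newidx, PETSC_OWN_POINTER, &is_reduced));
> 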
>    If you have some code that you think should be doing this but doesn't work, feel free to send it to us and we may be able to fix it.
> 
>   Barry
> 
> 
> > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users <petsc-users at mcs.anl.gov> wrote:
> > 
> > Hello,
> > I'm solving a block system
> > [A C;
> > C' D],
> > where D is not zero, using the PCFIELDSPLIT preconditioner, and I set the split using PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS routines). Can PETSc do this automatically? Or else, any hints?
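> > A sketch of the setup described (hypothetical names and sizes; shown
> > for one rank, assuming the nA rows of the A-block come first, followed
> > by the nD rows of the D-block):
> > 
> >   IS isA, isD;
> >   PetscCall(ISCreateStride(PETSC_COMM_SELF, nA, 0,  1, &isA));
> >   PetscCall(ISCreateStride(PETSC_COMM_SELF, nD, nA, 1, &isD));
> >   PetscCall(PCFieldSplitSetIS(pc, "0", isA));
> >   PetscCall(PCFieldSplitSetIS(pc, "1", isD));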
> > Kind regards,
> > Carl-Johan
> 


