[petsc-users] solving saddle point problem in distributed memory

Barry Smith bsmith at petsc.dev
Tue Jul 28 21:49:47 CDT 2020


   No, PETSc's built-in direct solvers do not run in parallel; only the external packages SuperLU_DIST, MUMPS, and PaStiX do. Use

    -pc_type lu (or -pc_type cholesky, depending on the package) -pc_factor_mat_solver_type superlu_dist, mumps, or pastix

   to run the direct solver in parallel. 
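
   For example, a minimal sketch of the solve (untested; it assumes A, b, and x are already assembled on PETSC_COMM_WORLD and that your code calls KSPSetFromOptions() so the command line options take effect):

     #include <petscksp.h>

     PetscErrorCode SolveDirect(Mat A, Vec b, Vec x)
     {
       KSP            ksp;
       PetscErrorCode ierr;

       ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
       ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
       /* picks up -ksp_type, -pc_type, and -pc_factor_mat_solver_type from the command line */
       ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
       ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
       ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
       return 0;
     }

   run, for example, as

     mpiexec -n 4 ./yourapp -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

   where -ksp_type preonly means no Krylov iterations are done and the factorization is applied once.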



> On Jul 28, 2020, at 9:01 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
> 
> Dear Barry,
>  
> Thanks for your explanation. Does it mean the default direct solver (LU) in PETSc can also run in distributed-memory mode? I have only used iterative solvers with preconditioners in distributed memory before. My experience with direct solvers in parallel is indeed limited, especially for saddle point problems. I have searched online but have not found a working setup so far. Could you give a sample setup of a direct solver for a parallel run? It would be really appreciated.
> 
> From: Barry Smith [mailto:bsmith at petsc.dev] 
> Sent: Wednesday, 29 July 2020 9:48 AM
> To: Bin Liu <lbllm2018 at hotmail.com>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] solving saddle point problem in distributed memory
>  
>  
>    SuperLU_DIST won't "magically" work on saddle point problems. It only does limited pivoting; realistically, a parallel LU cannot do complete pivoting without becoming a sequential algorithm. In parallel you need to use PCFIELDSPLIT; sequentially you can use SuperLU (not SuperLU_DIST), which, being a sequential algorithm, can do more pivoting.
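> 
>    A rough starting point for the options (untested, and PCFIELDSPLIT still needs to know which unknowns are velocity and which are pressure, via a DM, index sets passed to PCFieldSplitSetIS(), or -pc_fieldsplit_detect_saddle_point when the pressure-pressure block is zero) is something like
> 
>      -ksp_type fgmres
>      -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type full
>      -pc_fieldsplit_schur_precondition selfp
>      -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type gamg
>      -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type jacobi
> 
>    The inner preconditioners are problem dependent, so treat this only as a place to start and check what you actually get with -ksp_view.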
>  
>   Barry
>  
> 
> 
> On Jul 28, 2020, at 7:59 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
>  
> Thanks for the tutorials. Yes, I tried PCFIELDSPLIT, but it only works for sequential runs; when I run the code in distributed memory, it reports errors. Essentially, what I want to know is: (a) how do I set up superlu_dist in PETSc for solving a saddle point problem in distributed memory? (b) can the direct solvers in PETSc run in distributed memory for solving a saddle point problem?
>  
> From: Stefano Zampini [mailto:stefano.zampini at gmail.com]
> Sent: Tuesday, 28 July 2020 6:55 PM
> To: Bin Liu <lbllm2018 at hotmail.com>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] solving saddle point problem in distributed memory
>  
> If you want advice, you should post the error trace PETSc reports.
>  
> Anyway, solving Stokes is not so trivial (without direct solvers, you may need mesh-dependent information), but we have examples for it:
>  
> https://gitlab.com/petsc/petsc/-/blob/master/src/ksp/ksp/tutorials/ex42.c
> https://gitlab.com/petsc/petsc/-/blob/master/src/ksp/ksp/tutorials/ex43.c
> https://gitlab.com/petsc/petsc/-/blob/master/src/snes/tutorials/ex69.c
>  
> If you scroll to the end of those files, you will see a number of possible option sets using either PCFIELDSPLIT, PCBDDC, or KSPFETIDP.
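> 
> For example, to try one of them in parallel (a generic sketch, assuming a working PETSc build with PETSC_DIR and PETSC_ARCH set):
> 
>     cd $PETSC_DIR/src/ksp/ksp/tutorials
>     make ex42
>     mpiexec -n 4 ./ex42
> 
> then append one of the option sets from the test block at the bottom of ex42.c to the mpiexec line to switch between the PCFIELDSPLIT, PCBDDC, and KSPFETIDP configurations.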
>  
> 
> 
> 
> On Jul 28, 2020, at 12:37 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
>  
> I would like to solve a saddle point problem arising from the Stokes equations. I have successfully used the direct solvers in sequential runs, and now I would like to extend this to distributed-memory computation. I tried superlu_dist, but the program returns errors. Is it possible to solve a saddle point problem in distributed memory using superlu_dist? Could anyone give a simple sample code to set up the solver parameters?
>  
> Thanks
>  


