[petsc-users] solving saddle point problem in distributed memory

Matthew Knepley knepley at gmail.com
Wed Jul 29 04:20:44 CDT 2020


On Tue, Jul 28, 2020 at 11:33 PM Bin Liu <lbllm2018 at hotmail.com> wrote:

> Dear Barry,
>
> Thanks for your suggestions. I just tested it. I still get the same error.
> I attached it below:
>
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Object is in wrong state
> [1]PETSC ERROR: Matrix is missing diagonal entry in the zeroed row 24214
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.9.1, Apr, 29, 2018
> [1]PETSC ERROR: ./re100 on a  named kdp-MS-7850 by kdp Wed Jul 29 11:26:57
> 2020
> [1]PETSC ERROR: Configure options --prefix=/home/kdp/petsc --debugging=0
> --with-mpi-dir=/home/kdp/mpich --download-fblaslapack --download-metis
> --download-parmetis --download-hypre=/home/kdp/hypre-2.14.0.tar.gz
> --download-superlu_dist
> [1]PETSC ERROR: #1 MatZeroRows_SeqAIJ() line 1819 in
> /home/kdp/petsc-3.9.1/src/mat/impls/aij/seq/aij.c
> [1]PETSC ERROR: #2 MatZeroRows() line 5710 in
> /home/kdp/petsc-3.9.1/src/mat/interface/matrix.c
> [1]PETSC ERROR: #3 MatZeroRows_MPIAIJ() line 797 in
> /home/kdp/petsc-3.9.1/src/mat/impls/aij/mpi/mpiaij.c
> [1]PETSC ERROR: #4 MatZeroRows() line 5710 in
> /home/kdp/petsc-3.9.1/src/mat/interface/matrix.c
>
> It still reports a missing diagonal entry. However, I have explicitly assigned
> 0.0 to the diagonal elements of the system matrix, and it solves in sequential
> mode without error. Where could my setup possibly be wrong?
>

1) Please send the complete error message. The stack trace here is cut off.

2) It does not look like it is failing in the solver, but in the call to
MatZeroRows(), which, as the message says, requires the diagonal entries to be present

3) It looks like you are not assembling the matrix correctly in parallel.

I recommend making the problem as small as possible, running on 1 and 2
processes, and printing out the matrices with -mat_view so you can compare
them entry by entry.
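
If it helps, here is a minimal sketch of one way to guarantee that every
locally owned row has an explicit diagonal entry before MatZeroRows() is
called. It is not taken from your code: A stands for your assembled MPIAIJ
system matrix, and it assumes your element assembly uses ADD_VALUES and that
the preallocation includes the diagonal positions.

  PetscInt       rstart, rend, row;
  PetscScalar    zero = 0.0;
  PetscErrorCode ierr;

  /* keep explicitly set zeros in the stored nonzero pattern */
  ierr = MatSetOption(A, MAT_IGNORE_ZERO_ENTRIES, PETSC_FALSE);CHKERRQ(ierr);

  /* add a 0.0 to every locally owned diagonal position during assembly */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (row = rstart; row < rend; row++) {
    ierr = MatSetValues(A, 1, &row, 1, &row, &zero, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

Running the reduced case with, e.g., "mpiexec -n 1 ./re100 -mat_view" and
"mpiexec -n 2 ./re100 -mat_view" then lets you compare the two assembled
matrices directly.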

  Thanks,

     Matt

> Best
>
> ------------------------------
> *From:* Barry Smith <bsmith at petsc.dev>
> *Sent:* Wednesday, July 29, 2020 10:49 AM
> *To:* Bin Liu <lbllm2018 at hotmail.com>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] solving saddle point problem in distributed
> memory
>
>
>    No, the PETSc built-in direct solvers do not run in parallel; only
> SuperLU_DIST, MUMPS, and PaStiX do. Use
>
>     -pc_type lu (or cholesky, depending on the package)
>     -pc_factor_mat_solver_type superlu_dist (or mumps or pastix)
>
>    to run the direct solver in parallel.
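>
>    For example, assuming your code calls KSPSetFromOptions(), a run line
> would look something like this (only an illustrative sketch, reusing the
> executable name from your error output; use whichever of these packages
> your PETSc was built with):
>
>       mpiexec -n 4 ./re100 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist
>       mpiexec -n 4 ./re100 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps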
>
>
>
> On Jul 28, 2020, at 9:01 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
>
> Dear Barry,
>
>
>
> Thanks for your explanation. Does it mean the default direct solver LU in
> PETSc can also run in distributed-memory mode? I have only used iterative
> solvers with preconditioners in distributed memory before. My experience
> using direct solvers in parallel is indeed limited, especially for saddle
> point problems in parallel. I have done some searching online, but I have
> not found a working setup so far. Could you give a sample setup of the
> direct solver for a parallel run? It would be really appreciated.
>
>
>
> *From:* Barry Smith [mailto:bsmith at petsc.dev]
> *Sent:* Wednesday, 29 July 2020 9:48 AM
> *To:* Bin Liu <lbllm2018 at hotmail.com>
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] solving saddle point problem in distributed
> memory
>
>
>
>
>
>    SuperLU_DIST won't "magically" run on saddle point problems. It only
> does limited pivoting; realistically, a parallel LU cannot always do
> complete pivoting or it becomes a sequential algorithm. For parallel runs
> you need to use PCFIELDSPLIT; for sequential runs you can use SuperLU (not
> SuperLU_DIST), since, being a sequential algorithm, it can do more pivoting.
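>
>    As a concrete starting point, a typical PCFIELDSPLIT setup for a
> Stokes-type saddle point system looks something like the sketch below
> (only a sketch: the inner solver choices depend on your discretization and
> on which packages your PETSc was built with):
>
>       -ksp_type fgmres
>       -pc_type fieldsplit -pc_fieldsplit_detect_saddle_point
>       -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type full
>       -pc_fieldsplit_schur_precondition selfp
>       -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
>       -fieldsplit_0_pc_factor_mat_solver_type superlu_dist
>       -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type jacobi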
>
>
>
>   Barry
>
>
>
>
>
> On Jul 28, 2020, at 7:59 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
>
>
>
> Thanks for your tutorials. Yes, I tried PCFIELDSPLIT. However, it only
> works for sequential runs; when I run the code in distributed memory, it
> reports errors. In fact, the essence of my question is: (a) how do I set up
> superlu_dist in PETSc for solving a saddle point problem in distributed
> memory? (b) can the direct solvers in PETSc run in distributed memory for
> solving saddle point problems?
>
>
>
> *From:* Stefano Zampini [mailto:stefano.zampini at gmail.com]
> *Sent:* Tuesday, 28 July 2020 6:55 PM
> *To:* Bin Liu <lbllm2018 at hotmail.com>
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] solving saddle point problem in distributed
> memory
>
>
>
> If you want advice, you should post the error trace PETSc reports.
>
>
>
> Anyway, solving Stokes is not so trivial (without direct solvers, you may
> need mesh-dependent information), but we have examples for it:
>
>
>
> https://gitlab.com/petsc/petsc/-/blob/master/src/ksp/ksp/tutorials/ex42.c
>
> https://gitlab.com/petsc/petsc/-/blob/master/src/ksp/ksp/tutorials/ex43.c
>
> https://gitlab.com/petsc/petsc/-/blob/master/src/snes/tutorials/ex69.c
>
>
>
> If you scroll to the end of those files, you will see a bunch of possible
> options using either PCFIELDSPLIT, PCBDDC, or KSPFETIDP.
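>
> If you prefer to define the velocity/pressure splits in code instead of on
> the command line, a minimal sketch (the index sets isU and isP are
> placeholders you would build from your own dof numbering, and ksp is your
> already-created solver) would be
>
>     PC             pc;
>     IS             isU, isP;   /* build these from your velocity / pressure dofs */
>     PetscErrorCode ierr;
>
>     ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>     ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
>     ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
>     ierr = PCFieldSplitSetIS(pc, "u", isU);CHKERRQ(ierr);  /* velocity block */
>     ierr = PCFieldSplitSetIS(pc, "p", isP);CHKERRQ(ierr);  /* pressure block */
>     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
>
> With named splits, the option prefixes become -fieldsplit_u_ and
> -fieldsplit_p_.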
>
>
>
>
>
>
> On Jul 28, 2020, at 12:37 PM, Bin Liu <lbllm2018 at hotmail.com> wrote:
>
>
>
> I would like to solve a saddle point problem arising from the Stokes
> equations. I have successfully used the direct solvers in sequential runs.
> However, I would like to extend this to distributed-memory computation. I
> tried to use superlu_dist, but the program returns errors. Is it possible
> to solve a saddle point problem in distributed memory using superlu_dist?
> Could anyone give a simple sample code to set up the parameters of the
> solver?
>
>
>
> Thanks
>
>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/