[petsc-users] Eliminating rows and columns which are zeros

Matthew Knepley knepley at gmail.com
Mon Feb 6 15:52:06 CST 2023


On Mon, Feb 6, 2023 at 4:45 PM Karthikeyan Chockalingam - STFC UKRI via
petsc-users <petsc-users at mcs.anl.gov> wrote:

> No problem. I don’t completely follow.
>
>
>
> (Q1) I have used MATMPIAIJ but I am not sure what MatZero* (star) is, what
> it does, or its relevance to my problem.
>

Barry means MatZeroRows(), MatZeroRowsColumns(), etc.
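For example, a typical call looks like this (a rough sketch; the index set
name is just illustrative):

  /* isbc holds the global indices of the Dirichlet rows/columns;     */
  /* zero those rows and columns, put 1.0 on their diagonals, and fix */
  /* up b so the solution keeps the boundary values already in x      */
  PetscCall(MatZeroRowsColumnsIS(A, isbc, 1.0, x, b));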


> (Q2) Since I am creating a MATMPIAIJ system, what would be the best way to
> insert 0.0 into ALL diagonals (both active and inactive rows) to begin
> with?
>

I would just write a loop to SetValues on (i,i).
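Something like this (a rough sketch, assuming A is your MATMPIAIJ matrix and
the rest of the assembly also uses ADD_VALUES):

  PetscInt i, rstart, rend;
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    /* an explicit 0.0 so every row, active or inactive, has a diagonal entry */
    PetscCall(MatSetValue(A, i, i, 0.0, ADD_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));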


> (Q3) Inserting 0.0 only into the diagonals of the "inactive" rows after I
> have put values into the matrix would be an effort, unless there is a
> straightforward way to do it in PETSc.
>

You can just do it for all rows.


> (Q4) For my problem, do I need to use PCREDISTRIBUTE, or would any linear
> solver eliminate those rows?
>

Only REDISTRIBUTE will eliminate zero rows.
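A rough sketch of the setup in code (variable names are illustrative),
equivalent to -pc_type redistribute on the command line:

  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCREDISTRIBUTE)); /* removes rows/columns with only a diagonal entry before the inner solve */
  PetscCall(KSPSetFromOptions(ksp));        /* -redistribute_* options control the inner solver */
  PetscCall(KSPSolve(ksp, b, x));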

  Thanks,

     Matt


>
>
> Best,
>
> Karthik.
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Monday, 6 February 2023 at 20:18
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>   Sorry, I had a mistake in my thinking: PCREDISTRIBUTE supports
> completely empty rows but MatZero* does not.
>
>
>
>   When you put values into the matrix you will need to insert a 0.0 on the
> diagonal of each "inactive" row; all of this will be eliminated during the
> linear solve process. It would be a major project to change the MatZero*
> functions to handle empty rows.
>
>
>
>   Barry
>
>
>
>
>
> On Feb 4, 2023, at 12:06 PM, Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
>
>
> Thank you very much for offering to debug.
>
>
>
> I built PETSc along with AMReX, so I tried to extract a stand-alone piece
> of PETSc code that reproduces the same error on the smallest possible
> problem size.
>
>
>
> I have attached three files:
>
>
>
> petsc_amrex_error_redistribute.txt – The error message from the
> amrex/petsc interface, although the linear system still solves and
> converges to a solution.
>
>
>
> problem.c – A simple stand-alone PETSc code that produces almost the
> same error message.
>
>
>
> petsc_error_redistribute.txt – The error message from problem.c, but
> strangely it does NOT solve – I am not sure why.
>
>
>
> Please use problem.c to debug the issue.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Saturday, 4 February 2023 at 00:22
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>    If you can help me reproduce the problem with a simple code I can debug
> the problem and fix it.
>
>
>
>   Barry
>
>
>
>
>
>
> On Feb 3, 2023, at 6:42 PM, Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
>
>
> I updated the main branch to the below commit but the same problem
> persists.
>
>
>
> *[0]PETSC ERROR: Petsc Development GIT revision:
> v3.18.4-529-g995ec06f92  GIT Date: 2023-02-03 18:41:48 +0000*
>
>
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Friday, 3 February 2023 at 18:51
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>    If you switch to using the main branch of petsc
> https://petsc.org/release/install/download/#advanced-obtain-petsc-development-version-with-git
> you will not have the problem below (previously we required that a row
> exist before we zeroed it, but now we allow the row to initially have no
> entries and still be zeroed).
>
>
>
>   Barry
>
>
>
>
>
> On Feb 3, 2023, at 1:04 PM, Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
>
>
> Thank you. The entire error output was an attachment in my previous email.
> I am pasting it here for your reference.
>
>
>
>
>
>
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
>
> [0]PETSC ERROR: Object is in wrong state
>
> [0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)
>
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
>
> [0]PETSC ERROR: Option left: name:-options_left (no value)
>
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-127-ga207d08eda
> GIT Date: 2022-10-30 11:03:25 -0500
>
> [0]PETSC ERROR:
> /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM on a  named
> HC20210312 by karthikeyan.chockalingam Fri Feb  3 11:10:01 2023
>
> [0]PETSC ERROR: Configure options --with-debugging=0
> --prefix=/Users/karthikeyan.chockalingam/AMReX/petsc
> --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes
> --with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre
>
> [0]PETSC ERROR: #1 MatZeroRowsColumns_SeqAIJ() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/seq/aij.c:2218
>
> [0]PETSC ERROR: #2 MatZeroRowsColumns() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
>
> [0]PETSC ERROR: #3 MatZeroRowsColumns_MPIAIJ() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/mpi/mpiaij.c:879
>
> [0]PETSC ERROR: #4 MatZeroRowsColumns() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
>
> [0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124
>
> [0]PETSC ERROR: #6 localAssembly() at
> /Users/karthikeyan.chockalingam/AMReX/amrFEM/src/FENodalPoisson.cpp:435
>
>     Residual norms for redistribute_ solve.
>
>     0 KSP preconditioned resid norm 5.182603110407e+00 true resid norm
> 1.382027496109e+01 ||r(i)||/||b|| 1.000000000000e+00
>
>     1 KSP preconditioned resid norm 1.862430383976e+00 true resid norm
> 4.966481023937e+00 ||r(i)||/||b|| 3.593619546588e-01
>
>     2 KSP preconditioned resid norm 2.132803507689e-01 true resid norm
> 5.687476020503e-01 ||r(i)||/||b|| 4.115313216645e-02
>
>     3 KSP preconditioned resid norm 5.499797533437e-02 true resid norm
> 1.466612675583e-01 ||r(i)||/||b|| 1.061203687852e-02
>
>     4 KSP preconditioned resid norm 2.829814271435e-02 true resid norm
> 7.546171390493e-02 ||r(i)||/||b|| 5.460217985345e-03
>
>     5 KSP preconditioned resid norm 7.431048995318e-03 true resid norm
> 1.981613065418e-02 ||r(i)||/||b|| 1.433844891652e-03
>
>     6 KSP preconditioned resid norm 3.182040728972e-03 true resid norm
> 8.485441943932e-03 ||r(i)||/||b|| 6.139850305312e-04
>
>     7 KSP preconditioned resid norm 1.030867020459e-03 true resid norm
> 2.748978721225e-03 ||r(i)||/||b|| 1.989091193167e-04
>
>     8 KSP preconditioned resid norm 4.469429300003e-04 true resid norm
> 1.191847813335e-03 ||r(i)||/||b|| 8.623908111021e-05
>
>     9 KSP preconditioned resid norm 1.237303313796e-04 true resid norm
> 3.299475503456e-04 ||r(i)||/||b|| 2.387416685085e-05
>
>    10 KSP preconditioned resid norm 5.822094326756e-05 true resid norm
> 1.552558487134e-04 ||r(i)||/||b|| 1.123391894522e-05
>
>    11 KSP preconditioned resid norm 1.735776150969e-05 true resid norm
> 4.628736402585e-05 ||r(i)||/||b|| 3.349236115503e-06
>
>   Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 11
>
> KSP Object: (redistribute_) 1 MPI process
>
>   type: cg
>
>   maximum iterations=10000, initial guess is zero
>
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>
>   left preconditioning
>
>   using PRECONDITIONED norm type for convergence test
>
> PC Object: (redistribute_) 1 MPI process
>
>   type: jacobi
>
>     type DIAGONAL
>
>   linear system matrix = precond matrix:
>
>   Mat Object: 1 MPI process
>
>     type: mpiaij
>
>     rows=48896, cols=48896
>
>     total: nonzeros=307976, allocated nonzeros=307976
>
>     total number of mallocs used during MatSetValues calls=0
>
>       not using I-node (on process 0) routines
>
> End of program
>
>  solve time 0.564714744 seconds
>
> Starting max value is: 0
>
> Min value of level 0 is: 0
>
> Interpolated min value is: 741.978761
>
> Unused ParmParse Variables:
>
>   [TOP]::model.type(nvals = 1)  :: [3]
>
>   [TOP]::ref_ratio(nvals = 1)  :: [2]
>
>
>
> AMReX (22.10-20-g3082028e4287) finalized
>
> #PETSc Option Table entries:
>
> -ksp_type preonly
>
> -options_left
>
> -pc_type redistribute
>
> -redistribute_ksp_converged_reason
>
> -redistribute_ksp_monitor_true_residual
>
> -redistribute_ksp_type cg
>
> -redistribute_ksp_view
>
> -redistribute_pc_type jacobi
>
> #End of PETSc Option Table entries
>
> There are no unused options.
>
> Program ended with exit code: 0
>
>
>
>
>
> Best,
>
> Karthik.
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Friday, 3 February 2023 at 17:41
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>    We need all the error output for the errors you got below to understand
> why the errors are happening.
>
>
>
>
>
>
> On Feb 3, 2023, at 11:41 AM, Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
>
>
> Hello Barry,
>
>
>
> I would like to better understand pc_type redistribute usage.
>
>
>
> I plan to use pc_type *redistribute* in the context of adaptive mesh
> refinement on a structured grid in 2D. My base mesh (level 0) is indexed
> from 0 to N-1 elements and the refined mesh (level 1) is indexed from 0 to
> 4(N-1) elements. When I construct the system matrix A on (level 1), I
> probably only use 20% of the 4(N-1) elements, but the indices are
> scattered over the range 0 to 4(N-1). That leaves 80% of the rows and
> columns of the system matrix A on (level 1) as zeros. From your earlier
> response, I believe this would be a use case for pc_type redistribute.
>
>
>
>   Indeed the linear solve will be more efficient if you use the
> redistribute solver.
>
>
>
>   But I don't understand your plan. With adaptive refinement I would just
> create the two matrices: one for the initial grid on which you solve the
> system (this will be a smaller matrix), and then a new, larger matrix for
> the refined grid (and discard the previous matrix).
>
>
>
>
>
>
>
>
> Question (1)
>
>
>
>
>
> If N is really large, I would have to allocate memory of size 4(N-1) for
> the system matrix A on (level 1). How does pc_type redistribute help?
> Because I did end up allocating memory for a large system where most of
> the rows and columns are zeros. Is most of the allocated memory not wasted?
>  *Is this the correct usage?*
>
>
>
>   See above
>
>
>
>
>
>
>
>
> Question (2)
>
>
>
>
>
> I tried using pc_type redistribute for a two-level system.
>
> I have *attached* the output only for (level 1).
>
> The solution converges to the right solution, but PETSc still outputs some
> error messages.
>
>
>
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
>
> [0]PETSC ERROR: Option left: name:-options_left (no value)
>
>
>
> But there were no unused options:
>
>
>
> #PETSc Option Table entries:
>
> -ksp_type preonly
>
> -options_left
>
> -pc_type redistribute
>
> -redistribute_ksp_converged_reason
>
> -redistribute_ksp_monitor_true_residual
>
> -redistribute_ksp_type cg
>
> -redistribute_ksp_view
>
> -redistribute_pc_type jacobi
>
> #End of PETSc Option Table entries
>
> *There are no unused options.*
>
> Program ended with exit code: 0
>
>
>
> I cannot explain this.
>
>
>
>
>
> Question (3)
>
>
>
> [0]PETSC ERROR: Object is in wrong state
>
> [0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)
>
>
>
> What does this error message imply? Given that I only use 20% of the
> 4(N-1) indices, I can imagine most of the diagonal entries are zero. *Is
> my understanding correct?*
>
>
>
>
>
> Question (4)
>
>
>
>
>
> [0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at
> /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124
>
>
>
> I am using MatZeroRowsColumnsIS to set the homogeneous Dirichlet boundary
> condition. I don’t follow why I get this error message, as the linear
> system converges to the right solution.
>
>
>
> Thank you for your help.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Tuesday, 10 January 2023 at 18:50
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>   Yes, after the solve the x will contain correct values for ALL the
> locations, including the zeroed-out rows. Your use case is exactly what
> redistribute is for.
>
>
>
>   Barry
>
>
>
>
>
>
>
>
>
> On Jan 10, 2023, at 11:25 AM, Karthikeyan Chockalingam - STFC UKRI <
> karthikeyan.chockalingam at stfc.ac.uk> wrote:
>
>
>
> Thank you Barry. This is great!
>
>
>
> I plan to solve using ‘-pc_type redistribute’ after applying the Dirichlet
> bc using
>
> MatZeroRowsColumnsIS(A, isout, 1, x, b);
>
>
>
> When I retrieve the solution data from x (after the solve), can I index
> it using the original ordering (if I may say that)?
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
> *From: *Barry Smith <bsmith at petsc.dev>
> *Date: *Tuesday, 10 January 2023 at 16:04
> *To: *Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> *Cc: *petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject: *Re: [petsc-users] Eliminating rows and columns which are zeros
>
>
>
>
> https://petsc.org/release/docs/manualpages/PC/PCREDISTRIBUTE/#pcredistribute
>  -pc_type redistribute
>
>
>
>
>
> It does everything for you. Note that if the right hand side for any of
> the "zero" rows is nonzero then the system is inconsistent and the system
> does not have a solution.
>
>
>
> Barry
>
>
>
>
>
>
>
>
> On Jan 10, 2023, at 10:30 AM, Karthikeyan Chockalingam - STFC UKRI via
> petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>
>
> Hello,
>
>
>
> I am assembling a MATAIJ matrix of size N, where a very large number of
> rows (and corresponding columns) are zeros. I would like to potentially
> eliminate them before the solve.
>
>
>
> For instance say N=7
>
>
>
>  0  0  0  0  0  0  0
>
>  0  1 -1  0  0  0  0
>
>  0 -1  2  0  0  0 -1
>
>  0  0  0  0  0  0  0
>
>  0  0  0  0  0  0  0
>
>  0  0  0  0  0  0  0
>
>  0  0 -1  0  0  0  1
>
>
>
> I would like to reduce it to a 3x3
>
>
>
>  1 -1  0
>
> -1  2 -1
>
>  0 -1  1
>
>
>
> I do know the size N.
>
>
>
> Q1) How do I do it?
>
> Q2) Is it better to eliminate them as it would save a lot of memory?
>
> Q3) At the moment, I don’t know which rows (and columns) have the zero
> entries but with some effort I probably can find them. Should I know which
> rows (and columns) I am eliminating?
>
>
>
> Thank you.
>
>
>
> Karthik.
>
> This email and any attachments are intended solely for the use of the
> named recipients. If you are not the intended recipient you must not use,
> disclose, copy or distribute this email or any of its attachments  and
> should notify the sender immediately and delete this email from your
> system. UK Research and Innovation (UKRI) has taken every reasonable
> precaution to minimise risk of this email or any attachments containing
> viruses or malware but the recipient should carry out its own virus and
> malware checks before opening the attachments. UKRI does not accept any
> liability for any losses or damages which the recipient may sustain due to
> presence of any viruses.
>
>
>
> <petsc_redistribute.txt>
>
>
>
>
> <petsc_error_redistribute.txt><petsc_amrex_error_redistribute.txt><problem.c>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

