[petsc-users] Eliminating rows and columns which are zeros

Karthikeyan Chockalingam - STFC UKRI karthikeyan.chockalingam at stfc.ac.uk
Fri Feb 10 16:32:38 CST 2023


Yes, but d_nnz_vec and o_nnz_vec are not of local (per-MPI-rank) size but of global size N.

Below is an alternative approach, making both d_nnz_vec and o_nnz_vec PETSc MPI Vecs instead of C++ containers.

Is this okay?

Vec d_nnz_vec, o_nnz_vec;

MatSetType(A, MATAIJ);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);

PetscInt rstart, rend;
MatGetOwnershipRange(A, &rstart, &rend);

PetscInt local_size = rend - rstart;

VecCreateMPI(PETSC_COMM_WORLD, local_size, N, &d_nnz_vec);
VecDuplicate(d_nnz_vec, &o_nnz_vec);

// populate d_nnz_vec and o_nnz_vec (not shown here)
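// Illustration only: one possible way to fill these Vecs, assuming a
// hypothetical predicate row_is_active(i) and the scheme discussed
// below (active rows get 10/10, inactive rows 1/0):
for (PetscInt i = rstart; i < rend; i++) {
    VecSetValue(d_nnz_vec, i, row_is_active(i) ? 10.0 : 1.0, INSERT_VALUES);
    VecSetValue(o_nnz_vec, i, row_is_active(i) ? 10.0 : 0.0, INSERT_VALUES);
}
VecAssemblyBegin(d_nnz_vec); VecAssemblyEnd(d_nnz_vec);
VecAssemblyBegin(o_nnz_vec); VecAssemblyEnd(o_nnz_vec);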

PetscInt    *d_nnz, *o_nnz, *indices;
PetscScalar *d_vals, *o_vals;

PetscMalloc(local_size * sizeof(PetscInt), &d_nnz);
PetscMalloc(local_size * sizeof(PetscInt), &o_nnz);
PetscMalloc(local_size * sizeof(PetscInt), &indices);

// VecGetValues fills a PetscScalar array, so read into scalar
// buffers first and then convert to PetscInt
PetscMalloc(local_size * sizeof(PetscScalar), &d_vals);
PetscMalloc(local_size * sizeof(PetscScalar), &o_vals);

for (int i = 0; i < local_size; i++)
        indices[i] = i + rstart;

VecGetValues(d_nnz_vec, local_size, indices, d_vals);
VecGetValues(o_nnz_vec, local_size, indices, o_vals);

for (int i = 0; i < local_size; i++) {
        d_nnz[i] = (PetscInt)PetscRealPart(d_vals[i]);
        o_nnz[i] = (PetscInt)PetscRealPart(o_vals[i]);
}




MatMPIAIJSetPreallocation(A, max_d_nz, d_nnz, max_o_nz, o_nnz);



PetscFree(d_nnz);
PetscFree(o_nnz);
PetscFree(indices);
PetscFree(d_vals);
PetscFree(o_vals);



Kind regards,
Karthik.


From: Barry Smith <bsmith at petsc.dev>
Date: Friday, 10 February 2023 at 20:53
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros



On Feb 10, 2023, at 3:34 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk> wrote:

Thank you for your response.

“But the original big matrix with lots of extra reserved space still exists (your A).”  - I would like to save memory on rows that are ‘inactive’ by making use of d_nnz and o_nnz.

(Q1) When is a row considered “inactive” and dropped by -pc_type redistribute during the solve? Is it when the entire row is zero?

  When there is only a diagonal entry or no entries.


(Q2) If the answer to (Q1) is ‘yes’, am I right in thinking it would be beneficial to have 0.0 on the diagonal of the ‘inactive’ rows instead of ‘1.0’?

   Doesn't matter, you can have any value on the diagonal.


Currently, I am setting 0.0 as the diagonal element of the ‘inactive’ rows.
I propose the following; please let me know if I am heading in the right direction.

(Q3) I am planning to create two C++ vectors, d_nnz_vec and o_nnz_vec, of size N.
Should d_nnz_vec[inactive_row_index] = 0 and o_nnz_vec[inactive_row_index] = 0, or
d_nnz_vec[inactive_row_index] = 1 and o_nnz_vec[inactive_row_index] = 0?

d_nnz_vec[active_row_index] = 10 and o_nnz_vec[active_row_index] = 10

  Yes, this seems like the correct scheme; it will not allocate unneeded extra space for inactive rows, exactly what you want.


(Q4) Is the below correct?

PetscInt * d_nnz = NULL;
PetscInt * o_nnz = NULL;
PetscInt max_d_nz = 10;
PetscInt max_o_nz = 10;


MatSetType(A, MATAIJ);
MatSetSizes(A, PETSC_DECIDE,PETSC_DECIDE,N,N);

PetscInt rstart, rend;
MatGetOwnershipRange(A,&rstart,&rend);

PetscInt local_size = rend - rstart;


PetscMalloc(local_size * sizeof(PetscInt), &d_nnz);
PetscMalloc(local_size * sizeof(PetscInt), &o_nnz);


   for (int i = 0; i < local_size; i++) {
     d_nnz[i] = d_nnz_vec[i + rstart];
     o_nnz[i] = o_nnz_vec[i + rstart];
   }


   Yes, but I think the + rstart is wrong. It is not needed because d_nnz and o_nnz have just the local size on that MPI rank.



MatMPIAIJSetPreallocation(A,max_d_nz,d_nnz,max_o_nz, o_nnz);


PetscFree(d_nnz);

PetscFree(o_nnz);





Kind regards,
Karthik.



From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Wednesday, 8 February 2023 at 21:10
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros




On Feb 8, 2023, at 11:19 AM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

No, I am not calling  MatMPIAIJSetPreallocation(... N,NULL);
Here is what I do:

     PetscInt d_nz = 10;
     PetscInt o_nz = 10;

     ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
     ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr);
     ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr);
     ierr = MatMPIAIJSetPreallocation(A, d_nz, NULL, o_nz, NULL); CHKERRQ(ierr);

(Q1)
As I am setting the size of A to be N x N via

     ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr);

  d_nz and o_nz determine the memory reserved for A. So if you use for example 10 that means 10 entries are reserved for every row, even inactive ones.
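  (For example, with N = 65536 rows and d_nz = o_nz = 10, on the order of 65536 x 20 entries are reserved up front, regardless of how many rows are actually active.)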




and pre-allocation is done for ALL rows, I would like to understand whether the ‘inactive rows’ are NOT contributing to memory (while using ‘redistribute’)?

  Redistribute will drop all the inactive rows from the computation of the solve, it generates a smaller matrix missing all those rows and columns. But the original big matrix with lots of extra reserved space still exists (your A).



(Q2)

I tried solving with hypre within redistribute, and the system converges to a solution. Is the below the correct way to use hypre within redistribute?

    ierr = PetscOptionsSetValue(NULL,"-ksp_type", "preonly");
    ierr = PetscOptionsSetValue(NULL,"-pc_type", "redistribute");
    ierr = PetscOptionsSetValue(NULL,"-redistribute_ksp_type", "cg");
    ierr = PetscOptionsSetValue(NULL,"-redistribute_pc_type", "hypre");
    ierr = PetscOptionsSetValue(NULL,"-redistribute_pc_hypre_type", "boomeramg");

Correct. You can run with -ksp_view and it will provide all the information about how the solves are layered.
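
For example (with a hypothetical executable name ./app), the layered configuration above corresponds to the command line:

    ./app -ksp_type preonly -pc_type redistribute -redistribute_ksp_type cg \
          -redistribute_pc_type hypre -redistribute_pc_hypre_type boomeramg -ksp_view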




Many thanks,

Karthik.

From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Tuesday, 7 February 2023 at 19:52
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros



On Feb 7, 2023, at 1:20 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

Thank you Barry for your detailed response.

I would like to shed some light on what I am trying to accomplish using PETSc and AMReX. Please see the attached adaptive mesh image (and ignore the mesh-order legend for now).

The number of elements on each level is a geometric progression.
N - number of elements on the level indexed by ‘n’
n - adaptive mesh level index (starting from 1)
a - number of elements on the base mesh = 16
r = 4 (each element is divided into four on the next higher level of refinement)

N = a r^(n-1)
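
For example, on the finest level n = 7 this gives N = 16 * 4^6 = 65536, the index range mentioned below.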

The final solution u is the superposition of the solutions from all levels (here we have a total of 7 levels).

u = u^(1) + u^(2) + … + u^(7)

Hence I have seven system matrix and solution vector pairs, one for each level.

On each level the element index varies from 1 to N, but on each level NOT all elements are ‘active’.
As you can see from the attached image, not all elements are active (a lot of white hollow spaces), so the ‘active’ indexes can be scattered anywhere between 1 and N = 65536 for n = 7.

(Q1) In my case, I can’t at the moment insert 1 on the diagonal because during assembly I am using ADD_VALUES, as a node can be common to many elements. So I added 0.0 to ALL diagonals. After global assembly, I find that the linear solver converges.

(Q2) After adding 0.0 to all diagonals, I am able to solve using either
   ierr = PetscOptionsSetValue(NULL,"-redistribute_pc_type", "jacobi"); CHKERRQ(ierr);
or
   ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr);
I was able to solve using hypre as well.

Do I need to use -pc_type redistribute or not? Because I am able to solve without it as well.

  No you do not need redistribute, but for large problems with many empty rows using a solver inside redistribute will be faster than just using that solver directly on the much larger (mostly empty) system.



(Q3) I am sorry if I sound like a broken record. On each level I request allocation for A[N][N]

  Not sure what you mean by this? Call MatMPIAIJSetPreallocation(... N,NULL); where N is the number of columns in the matrix?

  If so, yes this causes a huge malloc() by PETSc when it allocates the matrix; it is not scalable. Do you have a small upper bound on the number of nonzeros in a row, say 9 or 27? Then use that instead of N; not perfect, but much better than N.
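
  For instance, with an upper bound of 27 nonzeros per row, that would be:

     MatMPIAIJSetPreallocation(A, 27, NULL, 27, NULL);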

 Barry





as the indexes can be scattered anywhere between 1 and N, but most are ‘inactive rows’. Is -pc_type redistribute the way to go for me to save on memory? Though I request an A[N][N] allocation and not all rows are active, I wonder if I am wasting a huge amount of memory?

Kind regards,
Karthik.




From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Monday, 6 February 2023 at 22:42
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

  Sorry was not clear MatZero*. I just meant MatZeroRows() or MatZeroRowsColumns()



On Feb 6, 2023, at 4:45 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

No problem. I don’t completely follow.

(Q1) I have used MATMPIAIJ but I am not sure what MatZero* (star) is and what it does, or its relevance to my problem.

(Q2) Since I am creating a MATMPIAIJ system, what would be the best way to insert 0.0 into ALL diagonals (both active and inactive rows) to begin with?

  Yes, just have each MPI process loop over its rows and put zero on the diagonal (actually, you could put a 1 if you want). Then have your code use AMReX to put all its values in. I am assuming the code uses INSERT_VALUES, so it will always overwrite the value you put in initially (hence putting in 1 initially will be fine; the advantage of 1 is that if you do not use PCREDISTRIBUTE the matrix is fully defined and so any solver will work). If you know the inactive rows you can just put the diagonal on those, since AMReX will fill up the rest of the rows, but it is harmless to put values on all diagonal entries. Do NOT call MatAssemblyBegin/End between filling the diagonal entries and having AMReX put in its values.
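
  A minimal sketch of that seeding loop (assuming A has already been created and preallocated, and following the INSERT_VALUES assumption above):

     PetscInt rstart, rend;
     MatGetOwnershipRange(A, &rstart, &rend);
     for (PetscInt i = rstart; i < rend; i++) {
       // seed every locally owned diagonal; AMReX's later INSERT_VALUES
       // pass overwrites the entries of the active rows
       MatSetValue(A, i, i, 1.0, INSERT_VALUES);
     }
     // no MatAssemblyBegin/End here; assemble only once, after AMReX
     // has inserted all of its values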

(Q3) Having to insert 0.0 into the diagonals of only the ‘inactive’ rows after I have put values into the matrix would be an effort, unless there is a straightforward way to do it in PETSc.

(Q4) For my problem, do I need to use PCREDISTRIBUTE, or would any linear solver eliminate those rows?

   Well, no solver other than PCREDISTRIBUTE will really make sense if you have "inactive" rows, that is, rows with nothing in them.

   When PETSc was written, we didn't understand that having lots of completely empty rows was a use case, so much of the functionality does not work in that case.





Best,
Karthik.

From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Monday, 6 February 2023 at 20:18
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

  Sorry, I had a mistake in my thinking, PCREDISTRIBUTE supports completely empty rows but MatZero* does not.

  When you put values into the matrix you will need to insert a 0.0 on the diagonal of each "inactive" row; all of this will be eliminated during the linear solve process. It would be a major project to change the MatZero* functions to handle empty rows.

  Barry





On Feb 4, 2023, at 12:06 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

Thank you very much for offering to debug.

I built PETSc along with AMReX, so I tried to extract the PETSc code alone which would reproduce the same error on the smallest sized problem possible.

I have attached three files:

petsc_amrex_error_redistribute.txt – the error message from the amrex/petsc interface; however, the linear system solves and converges to a solution.

problem.c – a simple stand-alone PETSc code, which produces almost the same error message.

petsc_error_redistribute.txt – the error message from problem.c, but strangely it does NOT solve – I am not sure why.

Please use problem.c to debug the issue.

Kind regards,
Karthik.


From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Saturday, 4 February 2023 at 00:22
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

   If you can help me reproduce the problem with a simple code I can debug the problem and fix it.

  Barry






On Feb 3, 2023, at 6:42 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

I updated the main branch to the below commit but the same problem persists.

[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92  GIT Date: 2023-02-03 18:41:48 +0000


From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Friday, 3 February 2023 at 18:51
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

   If you switch to use the main branch of petsc https://petsc.org/release/install/download/#advanced-obtain-petsc-development-version-with-git you will not have the problem below (previously we required that a row exist before we zeroed it, but now we allow the row to initially have no entries and still be zeroed).

  Barry


On Feb 3, 2023, at 1:04 PM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

Thank you. The entire error output was an attachment in my previous email. I am pasting here for your reference.



[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-options_left (no value)
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-127-ga207d08eda  GIT Date: 2022-10-30 11:03:25 -0500
[0]PETSC ERROR: /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM on a  named HC20210312 by karthikeyan.chockalingam Fri Feb  3 11:10:01 2023
[0]PETSC ERROR: Configure options --with-debugging=0 --prefix=/Users/karthikeyan.chockalingam/AMReX/petsc --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre
[0]PETSC ERROR: #1 MatZeroRowsColumns_SeqAIJ() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/seq/aij.c:2218
[0]PETSC ERROR: #2 MatZeroRowsColumns() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
[0]PETSC ERROR: #3 MatZeroRowsColumns_MPIAIJ() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/mpi/mpiaij.c:879
[0]PETSC ERROR: #4 MatZeroRowsColumns() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
[0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124
[0]PETSC ERROR: #6 localAssembly() at /Users/karthikeyan.chockalingam/AMReX/amrFEM/src/FENodalPoisson.cpp:435
    Residual norms for redistribute_ solve.
    0 KSP preconditioned resid norm 5.182603110407e+00 true resid norm 1.382027496109e+01 ||r(i)||/||b|| 1.000000000000e+00
    1 KSP preconditioned resid norm 1.862430383976e+00 true resid norm 4.966481023937e+00 ||r(i)||/||b|| 3.593619546588e-01
    2 KSP preconditioned resid norm 2.132803507689e-01 true resid norm 5.687476020503e-01 ||r(i)||/||b|| 4.115313216645e-02
    3 KSP preconditioned resid norm 5.499797533437e-02 true resid norm 1.466612675583e-01 ||r(i)||/||b|| 1.061203687852e-02
    4 KSP preconditioned resid norm 2.829814271435e-02 true resid norm 7.546171390493e-02 ||r(i)||/||b|| 5.460217985345e-03
    5 KSP preconditioned resid norm 7.431048995318e-03 true resid norm 1.981613065418e-02 ||r(i)||/||b|| 1.433844891652e-03
    6 KSP preconditioned resid norm 3.182040728972e-03 true resid norm 8.485441943932e-03 ||r(i)||/||b|| 6.139850305312e-04
    7 KSP preconditioned resid norm 1.030867020459e-03 true resid norm 2.748978721225e-03 ||r(i)||/||b|| 1.989091193167e-04
    8 KSP preconditioned resid norm 4.469429300003e-04 true resid norm 1.191847813335e-03 ||r(i)||/||b|| 8.623908111021e-05
    9 KSP preconditioned resid norm 1.237303313796e-04 true resid norm 3.299475503456e-04 ||r(i)||/||b|| 2.387416685085e-05
   10 KSP preconditioned resid norm 5.822094326756e-05 true resid norm 1.552558487134e-04 ||r(i)||/||b|| 1.123391894522e-05
   11 KSP preconditioned resid norm 1.735776150969e-05 true resid norm 4.628736402585e-05 ||r(i)||/||b|| 3.349236115503e-06
  Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 11
KSP Object: (redistribute_) 1 MPI process
  type: cg
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: (redistribute_) 1 MPI process
  type: jacobi
    type DIAGONAL
  linear system matrix = precond matrix:
  Mat Object: 1 MPI process
    type: mpiaij
    rows=48896, cols=48896
    total: nonzeros=307976, allocated nonzeros=307976
    total number of mallocs used during MatSetValues calls=0
      not using I-node (on process 0) routines
End of program
 solve time 0.564714744 seconds
Starting max value is: 0
Min value of level 0 is: 0
Interpolated min value is: 741.978761
Unused ParmParse Variables:
  [TOP]::model.type(nvals = 1)  :: [3]
  [TOP]::ref_ratio(nvals = 1)  :: [2]

AMReX (22.10-20-g3082028e4287) finalized
#PETSc Option Table entries:
-ksp_type preonly
-options_left
-pc_type redistribute
-redistribute_ksp_converged_reason
-redistribute_ksp_monitor_true_residual
-redistribute_ksp_type cg
-redistribute_ksp_view
-redistribute_pc_type jacobi
#End of PETSc Option Table entries
There are no unused options.
Program ended with exit code: 0


Best,
Karthik.

From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Friday, 3 February 2023 at 17:41
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

   We need all the error output for the errors you got below to understand why the errors are happening.







On Feb 3, 2023, at 11:41 AM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

Hello Barry,

I would like to better understand pc_type redistribute usage.

I plan to use pc_type redistribute in the context of adaptive mesh refinement on a structured grid in 2D. My base mesh (level 0) is indexed from 0 to N-1 elements and the refined mesh (level 1) is indexed from 0 to 4(N-1) elements. When I construct the system matrix A on (level 1), I probably only use 20% of the 4(N-1) elements; however, the indexes are scattered in the range of 0 to 4(N-1). That leaves 80% of the rows and columns of the system matrix A on (level 1) zero. From your earlier response, I believe this would be a use case for pc_type redistribute.

  Indeed the linear solve will be more efficient if you use the redistribute solver.

  But I don't understand your plan. With adaptive refinement I would just create the two matrices: one for the initial grid on which you solve the system (this will be a smaller matrix), and then a new larger matrix for the refined grid (discarding the previous matrix).








Question (1)


If N is really large, I would have to allocate memory of size 4(N-1) for the system matrix A on (level 1). How does pc_type redistribute help? Because I would still end up allocating memory for a large system where most of the rows and columns are zeros. Is most of the allotted memory not wasted? Is this the correct usage?

  See above








Question (2)


I tried using pc_type redistribute for a two-level system.
I have attached the output only for (level 1).
It converges to the right solution, but PETSc still outputs some error messages.

[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-options_left (no value)

But there were no unused options:

#PETSc Option Table entries:
-ksp_type preonly
-options_left
-pc_type redistribute
-redistribute_ksp_converged_reason
-redistribute_ksp_monitor_true_residual
-redistribute_ksp_type cg
-redistribute_ksp_view
-redistribute_pc_type jacobi
#End of PETSc Option Table entries
There are no unused options.
Program ended with exit code: 0

I cannot explain this.







Question (3)

[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)

What does this error message imply? Given I only use 20% of the 4(N-1) indexes, I can imagine most of the diagonal entries are zero. Is my understanding correct?


Question (4)







[0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124

I am using MatZeroRowsColumnsIS to set the homogeneous Dirichlet boundary condition. I don’t follow why I get this error message, as the linear system converges to the right solution.

Thank you for your help.

Kind regards,
Karthik.



From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Tuesday, 10 January 2023 at 18:50
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

  Yes, after the solve the x will contain correct values for ALL the locations, including the zeroed-out rows. Your use case is exactly what redistribute is for.

  Barry









On Jan 10, 2023, at 11:25 AM, Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>> wrote:

Thank you Barry. This is great!

I plan to solve using ‘-pc_type redistribute’ after applying the Dirichlet bc using
MatZeroRowsColumnsIS(A, isout, 1, x, b);
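
For context, a minimal sketch of how isout could be built, assuming hypothetical names n_boundary/boundary_rows for the count and global indices of the Dirichlet rows owned by this process:

     IS isout;
     ISCreateGeneral(PETSC_COMM_WORLD, n_boundary, boundary_rows, PETSC_COPY_VALUES, &isout);
     MatZeroRowsColumnsIS(A, isout, 1.0, x, b);
     ISDestroy(&isout);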

When I retrieve the solution data from x (after the solve), can I index it using the original ordering (if I may say that)?

Kind regards,
Karthik.

From: Barry Smith <bsmith at petsc.dev<mailto:bsmith at petsc.dev>>
Date: Tuesday, 10 January 2023 at 16:04
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk<mailto:karthikeyan.chockalingam at stfc.ac.uk>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov> <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros

https://petsc.org/release/docs/manualpages/PC/PCREDISTRIBUTE/#pcredistribute   -pc_type redistribute


It does everything for you. Note that if the right-hand side for any of the "zero" rows is nonzero, then the system is inconsistent and does not have a solution.
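
If you prefer to set this up in code rather than through the options database, a minimal sketch (assuming A, b, and x already exist) might be:

     KSP ksp;
     PC  pc;
     KSPCreate(PETSC_COMM_WORLD, &ksp);
     KSPSetOperators(ksp, A, A);
     KSPSetType(ksp, KSPPREONLY);
     KSPGetPC(ksp, &pc);
     PCSetType(pc, PCREDISTRIBUTE);
     KSPSetFromOptions(ksp);  /* picks up any -redistribute_* options */
     KSPSolve(ksp, b, x);
     KSPDestroy(&ksp);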

Barry










On Jan 10, 2023, at 10:30 AM, Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>> wrote:

Hello,

I am assembling a MATAIJ matrix of size N, where a very large number of rows (and corresponding columns) are zeros. I would like to potentially eliminate them before the solve.

For instance say N=7

 0  0  0  0  0  0  0
 0  1 -1  0  0  0  0
 0 -1  2  0  0  0 -1
 0  0  0  0  0  0  0
 0  0  0  0  0  0  0
 0  0  0  0  0  0  0
 0  0 -1  0  0  0  1

I would like to reduce it to a 3x3

 1 -1  0
-1  2 -1
 0 -1  1

I do know the size N.

Q1) How do I do it?
Q2) Is it better to eliminate them as it would save a lot of memory?
Q3) At the moment, I don’t know which rows (and columns) have the zero entries, but with some effort I probably can find them. Should I know which rows (and columns) I am eliminating?

Thank you.

Karthik.


<petsc_redistribute.txt>

<petsc_error_redistribute.txt><petsc_amrex_error_redistribute.txt><problem.c>

<adaptive_mesh_level.png>


