On 26 Sep 2022, at 2:52 PM, fujisan <fujisan43@gmail.com> wrote:

> Ok, thank you.
> I didn't know about MatCreateNormal.
>
> In terms of computational performance, what is the best way to solve Ax=b with A rectangular?
> Is it to keep A rectangular and use KSPLSQR along with PCNONE, or
> to convert to the normal equations using MatCreateNormal and use another KSP type with another PC type?

It's extremely cheap to call MatCreateNormal().
It's just an object that behaves like A^T A, but it's actually not assembled.
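To make that concrete, here is a minimal sketch of such an LSQR + MatCreateNormal() setup (in C, untested; it assumes a rectangular MPIAIJ Mat A and a conforming right-hand side Vec b that have already been assembled, inside a program that has called PetscInitialize()):

  Mat N;    /* implicit normal equations, behaves like A^T A     */
  Vec x;    /* solution, as many entries as A has columns        */
  KSP ksp;

  PetscCall(MatCreateNormal(A, &N));              /* cheap, nothing is assembled                */
  PetscCall(MatCreateVecs(A, &x, NULL));          /* right vector of A                          */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetType(ksp, KSPLSQR));
  PetscCall(KSPSetOperators(ksp, A, N));          /* rectangular Amat, normal equations as Pmat */
  PetscCall(KSPSetFromOptions(ksp));              /* pick the PC at run time                    */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(MatDestroy(&N));
  PetscCall(VecDestroy(&x));

With PCNONE you can get away with passing A as both Amat and Pmat, but supplying N as the Pmat keeps the door open for the preconditioners discussed further down the thread.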

> In the future, our A will be very sparse and will be something like up to 100 million rows and 10 million columns in size.

Following up on Matt's answer, the good news is that if A is indeed very sparse and if A^T A is sparse as well, then you have a couple of options.
I'd be happy to help you benchmark those.
Some of the largest problems I have been investigating thus far are https://sparse.tamu.edu/Hardesty/Hardesty3 and https://sparse.tamu.edu/Mittelmann/cont11_l
To this day, PETSc has no Mat partitioner for rectangular systems (like PaToH), so whenever you need to redistribute such systems, the normal equations must be formed explicitly.
This can get quite costly, but if you already have an application with the proper partitioning, then the sky is the limit, I guess.

Thanks,
Pierre

> I will study all that.
>
> Fuji

On Mon, Sep 26, 2022 at 11:45 AM Pierre Jolivet <pierre@joliv.et> wrote:

I'm sorry, solving overdetermined systems, alongside (overlapping) domain decomposition preconditioners and solving systems with multiple right-hand sides, is one of the topics for which I need to stop pushing new features and update the users manual instead…
The very basic documentation of PCQR is here: https://petsc.org/main/docs/manualpages/PC/PCQR (I'm guessing you are using the release documentation, in which it's indeed not present).
Some comments about your problem of solving Ax=b with a rectangular matrix A.
1) If you switch to KSPLSQR, it is wrong to use KSPSetOperators(ksp, A, A).
You can get away with murder if your PCType is PCNONE, but otherwise you should always supply the normal equations as the Pmat (you will get runtime errors otherwise).
To get the normal equations, you can use https://petsc.org/main/docs/manualpages/Mat/MatCreateNormal/
The following two points only apply if your Pmat is sparse (or sparse with some dense rows).
2) There are a couple of PCs that handle MATNORMAL: PCNONE, PCQR, PCJACOBI, PCBJACOBI, PCASM, PCHPDDM.
Currently, PCQR needs SuiteSparse, and thus runs only if the Pmat is distributed on a single MPI process (if your Pmat is distributed on multiple processes, you should first use PCREDUNDANT).
3) If you intend to scale your problem up in size, most of these preconditioners will not be very robust.
In that case, if your problem does not have any dense rows, you should either use PCHPDDM or MatConvert(Pmat, MATAIJ, PmatAIJ) and then use PCCHOLESKY, PCHYPRE, or PCGAMG (you can have a look at https://epubs.siam.org/doi/abs/10.1137/21M1434891 for a comparison). A rough sketch of 2) and 3) is appended after this message.
If your problem has dense rows, I have somewhere the code to recast it into an augmented system which is then solved using PCFIELDSPLIT (see https://link.springer.com/article/10.1007/s11075-018-0478-2). I can send it to you if needed.
Let me know if I can be of further assistance or if something is not clear to you.

Thanks,
Pierre
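Here is a rough sketch of what 2) and 3) can look like in practice (C, untested; A is the rectangular Mat, ksp is the KSPLSQR solver, the option names assume the default options prefixes, and a build with SuiteSparse is assumed for PCQR). The two alternatives are:

  Mat N, Naij;

  /* 2) keep the implicit normal equations and pick a PC that understands MATNORMAL */
  PetscCall(MatCreateNormal(A, &N));
  PetscCall(KSPSetOperators(ksp, A, N));
  /* then, at run time, e.g.:
       -ksp_type lsqr -pc_type jacobi
       -ksp_type lsqr -pc_type bjacobi -sub_pc_type qr          (sequential QR on each block)
       -ksp_type lsqr -pc_type redundant -redundant_pc_type qr
  */

  /* 3) assemble A^T A explicitly so that "standard" preconditioners can be used */
  PetscCall(MatConvert(N, MATAIJ, MAT_INITIAL_MATRIX, &Naij));
  PetscCall(KSPSetOperators(ksp, A, Naij));
  /* then, at run time, e.g.:
       -ksp_type lsqr -pc_type gamg
       -ksp_type lsqr -pc_type hypre
       -ksp_type lsqr -pc_type cholesky -pc_factor_mat_solver_type mumps
  */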

On 26 Sep 2022, at 10:56 AM, fujisan <fujisan43@gmail.com> wrote:

OK, thank you.

On Mon, Sep 26, 2022 at 10:53 AM Jose E. Roman <jroman@dsic.upv.es> wrote:

The QR factorization from SuiteSparse is sequential only; it cannot be used in parallel.
In parallel you can try PCBJACOBI with a PCQR local preconditioner.
Pierre may have additional suggestions.

Jose

> On 26 Sep 2022, at 10:47, fujisan <fujisan43@gmail.com> wrote:
> 
> I did configure PETSc with the option --download-suitesparse.
> 
> The error is more like this:
> PETSC ERROR: Could not locate a solver type for factorization type QR and matrix type mpiaij.
> 
> Fuji
> 
> On Mon, Sep 26, 2022 at 10:25 AM Jose E. Roman <jroman@dsic.upv.es> wrote:
> If the error message is "Could not locate a solver type for factorization type QR" then you should configure PETSc with --download-suitesparse.
> 
> Jose
> 
> 
> > On 26 Sep 2022, at 9:06, fujisan <fujisan43@gmail.com> wrote:
> > 
> > Thank you, Pierre,
> > 
> > I used PCNONE along with KSPLSQR and it worked.
> > But as for PCQR, it cannot be found. There is nothing about it in the documentation either.
> > 
> > Fuji
> > 
> > On Wed, Sep 21, 2022 at 12:20 PM Pierre Jolivet <pierre@joliv.et> wrote:
> > Yes, but you need to use a KSP that handles a rectangular Mat, such as KSPLSQR (-ksp_type lsqr).
> > PCLU does not handle a rectangular Pmat. The only PCs that handle a rectangular Pmat are PCQR and PCNONE.
> > If you supply the normal equations as the Pmat for LSQR, then you can use “standard” PCs.
> > You can have a look at https://petsc.org/main/src/ksp/ksp/tutorials/ex27.c.html, which covers most of these cases.
> > 
> > Thanks,
> > Pierre
> > 
> > (Sorry for the earlier answer sent wrongly to petsc-maint; please discard the previous email.)
> > 
> >> On 21 Sep 2022, at 10:03 AM, fujisan <fujisan43@gmail.com> wrote:
> >> 
> >> I'm trying to solve Ax=b with a sparse rectangular matrix A (of size 33x17 in my test) using
> >> the options '-ksp_type stcg -pc_type lu' on 1 or 2 CPUs.
> >> 
> >> And I always get an error saying "Incompatible vector local lengths" (see below).
> >> 
> >> Here are the relevant lines of my code:
> >> 
> >> program test
> >> ...
> >> ! Variable declarations
> >> 
> >> PetscCallA(PetscInitialize(PETSC_NULL_CHARACTER,ierr))
> >> 
> >> PetscCall(MatCreate(PETSC_COMM_WORLD,A,ierr))
> >> PetscCall(MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,m,n,ierr))
> >> PetscCall(MatSetType(A,MATMPIAIJ,ierr))
> >> PetscCall(MatSetFromOptions(A,ierr))
> >> PetscCall(MatSetUp(A,ierr))
> >> PetscCall(MatGetOwnershipRange(A,istart,iend,ierr))
> >> 
> >> do irow=istart,iend-1
> >> ... Reading from file ...
> >> PetscCall(MatSetValues(A,1,irow,nzv,col,val,ADD_VALUES,ierr))
> >> ...
> >> enddo
> >> 
> >> PetscCall(MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr))
> >> PetscCall(MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr))
> >> 
> >> ! Creating vectors x and b
> >> PetscCallA(MatCreateVecs(A,x,b,ierr))
> >> 
> >> ! Duplicating x in u.
> >> PetscCallA(VecDuplicate(x,u,ierr))
> >> 
> >> ! u is used to calculate b
> >> PetscCallA(VecSet(u,1.0,ierr))
> >> 
> >> PetscCallA(VecAssemblyBegin(u,ierr))
> >> PetscCallA(VecAssemblyEnd(u,ierr))
> >> 
> >> ! Calculating Au = b
> >> PetscCallA(MatMult(A,u,b,ierr)) ! A.u = b
> >> 
> >> PetscCallA(KSPSetType(ksp,KSPCG,ierr))
> >> 
> >> PetscCallA(KSPSetOperators(ksp,A,A,ierr))
> >> 
> >> PetscCallA(KSPSetFromOptions(ksp,ierr))
> >> 
> >> ! Solving Ax = b, x unknown
> >> PetscCallA(KSPSolve(ksp,b,x,ierr))
> >> 
> >> PetscCallA(VecDestroy(x,ierr))
> >> PetscCallA(VecDestroy(u,ierr))
> >> PetscCallA(VecDestroy(b,ierr))
> >> PetscCallA(MatDestroy(A,ierr))
> >> PetscCallA(KSPDestroy(ksp,ierr))
> >> 
> >> call PetscFinalize(ierr)
> >> end program
> >> 
> >> The code reads a sparse matrix from a binary file.
> >> I also output the sizes of matrix A and vectors b, x, u.
> >> They all seem consistent.
> >> 
> >> What am I doing wrong?
> >> Is it possible to solve Ax=b with A rectangular?
> >> 
> >> Thank you in advance for your help.
> >> Have a nice day.
> >> 
> >> Fuji
> >> 
> >> Matrix size : m= 33 n= 17 cpu size: 1
> >> Size of matrix A : 33 17
> >> Size of vector b : 33
> >> Size of vector x : 17
> >> Size of vector u : 17
> >> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> >> [0]PETSC ERROR: Arguments are incompatible
> >> [0]PETSC ERROR: Incompatible vector local lengths parameter # 1 local size 33 != parameter # 2 local size 17
> >> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> >> [0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-1341-g91b2b62a00 GIT Date: 2022-09-15 19:26:07 +0000
> >> [0]PETSC ERROR: ./bin/solve on a x86_64 named master by fujisan Tue Sep 20 16:56:37 2022
> >> [0]PETSC ERROR: Configure options --with-petsc-arch=x86_64 --COPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --with-debugging=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-single-library=1 --with-mpiexec=mpiexec --with-precision=double --with-fortran-interfaces=1 --with-make=1 --with-mpi=1 --with-mpi-compilers=1 --download-fblaslapack=0 --download-hypre=1 --download-cmake=0 --with-cmake=1 --download-metis=1 --download-parmetis=1 --download-ptscotch=0 --download-suitesparse=1 --download-triangle=1 --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1 --download-mumps=1 --download-elemental=1 --download-spai=0 --download-parms=1 --download-moab=1 --download-chaco=0 --download-fftw=1 --with-petsc4py=1 --download-mpi4py=1 --download-saws --download-concurrencykit=1 --download-revolve=1 --download-cams=1 --download-p4est=0 --with-zlib=1 --download-mfem=1 --download-glvis=0 --with-opengl=0 --download-libpng=1 --download-libjpeg=1 --download-slepc=1 --download-hpddm=1 --download-bamg=1 --download-mmg=0 --download-parmmg=0 --download-htool=1 --download-egads=0 --download-opencascade=0 PETSC_ARCH=x86_64
> >> [0]PETSC ERROR: #1 VecCopy() at /data/softs/petsc/src/vec/vec/interface/vector.c:1607
> >> [0]PETSC ERROR: #2 KSPSolve_BiCG() at /data/softs/petsc/src/ksp/ksp/impls/bicg/bicg.c:40
> >> [0]PETSC ERROR: #3 KSPSolve_Private() at /data/softs/petsc/src/ksp/ksp/interface/itfunc.c:877
> >> [0]PETSC ERROR: #4 KSPSolve() at /data/softs/petsc/src/ksp/ksp/interface/itfunc.c:1048
> >> [0]PETSC ERROR: #5 solve.F90:218
> >> Abort(75) on node 0 (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 75) - process 0
> >> 