[petsc-users] Problem solving Ax=b with rectangular matrix A

Matthew Knepley knepley at gmail.com
Mon Sep 26 07:58:42 CDT 2022


On Mon, Sep 26, 2022 at 8:52 AM fujisan <fujisan43 at gmail.com> wrote:

> OK, thank you.
> I didn't know about MatCreateNormal.
>
> In terms of computer performance, which is better for solving Ax=b with a
> rectangular A?
> Is it to keep A rectangular and use KSPLSQR along with PCNONE, or
> to convert to the normal equations using MatCreateNormal and use another
> KSP type with another PC type?
>

It depends on the spectral characteristics of the normal equations.
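
For what it's worth, LSQR on the rectangular A is, in exact arithmetic,
equivalent to CG on the normal equations, but since kappa(A^T A) =
kappa(A)^2, keeping A rectangular with LSQR is often the numerically safer
default when A is ill-conditioned. As a rough sketch, the two approaches
correspond to

  # keep A rectangular, no preconditioning
  -ksp_type lsqr -pc_type none

  # supply the normal equations as the Pmat (via MatCreateNormal in code)
  # and precondition those, e.g. with block Jacobi
  -ksp_type lsqr -pc_type bjacobi

and the better choice is the one whose preconditioned operator has the more
favorable spectrum for your A.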

  Thanks,

     Matt


> In the future, our A will be very sparse, up to something like
> 100 million rows and 10 million columns in size.
>
> I will study all that.
>
> Fuji
>
> On Mon, Sep 26, 2022 at 11:45 AM Pierre Jolivet <pierre at joliv.et> wrote:
>
>> I’m sorry, solving overdetermined systems, alongside (overlapping) domain
>> decomposition preconditioners and solving systems with multiple right-hand
>> sides, is one of the topics for which I need to stop pushing new features
>> and update the users manual instead…
>> The very basic documentation of PCQR is here:
>> https://petsc.org/main/docs/manualpages/PC/PCQR (I’m guessing you are
>> using the release documentation, in which it's indeed not present).
>> Some comments about your problem of solving Ax=b with a rectangular
>> matrix A.
>> 1) if you switch to KSPLSQR, it is wrong to use KSPSetOperators(ksp, A,
>> A).
>> You can get away with murder if your PCType is PCNONE, but otherwise you
>> should always supply the normal equations as the Pmat (you will get
>> runtime errors if you don't).
>> To get the normal equations, you can use
>> https://petsc.org/main/docs/manualpages/Mat/MatCreateNormal/
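>>
>> A minimal sketch of that in Fortran, mirroring the code you posted (N is
>> a new Mat variable; the calls are worth double-checking against the
>> manual pages):
>>
>>     Mat N
>>     PetscCallA(MatCreateNormal(A,N,ierr))     ! N applies A^T A implicitly
>>     PetscCallA(KSPSetType(ksp,KSPLSQR,ierr))
>>     PetscCallA(KSPSetOperators(ksp,A,N,ierr)) ! Amat = A, Pmat = normal equations
>>     ...
>>     PetscCallA(MatDestroy(N,ierr))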
>> The following two points only apply if your Pmat is sparse (or sparse
>> with some dense rows).
>> 2) there are a couple of PCs that handle MATNORMAL: PCNONE, PCQR,
>> PCJACOBI, PCBJACOBI, PCASM, and PCHPDDM.
>> Currently, PCQR needs SuiteSparse, and thus runs only if the Pmat is
>> distributed on a single MPI process (if your Pmat is distributed on
>> multiple processes, you should first use PCREDUNDANT).
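>>
>> As command-line options, that last combination might look like (a sketch;
>> I am assuming PCREDUNDANT's usual -redundant_ options prefix here):
>> -ksp_type lsqr -pc_type redundant -redundant_pc_type qr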
>> 3) if you intend to scale your problem up in size, most of these
>> preconditioners will not be very robust.
>> In that case, if your problem does not have any dense rows, you should
>> either use PCHPDDM or MatConvert(Pmat, MATAIJ, PmatAIJ) and then use
>> PCCHOLESKY, PCHYPRE or PCGAMG (you can have a look at
>> https://epubs.siam.org/doi/abs/10.1137/21M1434891 for a comparison).
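>> As an illustration of the MatConvert route, a sketch in Fortran (N is the
>> MATNORMAL Pmat from point 1, Naij a new Mat, and the MAT_INITIAL_MATRIX
>> reuse flag is my assumption):
>>
>>     PetscCallA(MatConvert(N,MATAIJ,MAT_INITIAL_MATRIX,Naij,ierr)) ! assemble A^T A explicitly
>>     PetscCallA(KSPSetOperators(ksp,A,Naij,ierr))
>>
>> after which -pc_type cholesky, -pc_type hypre, or -pc_type gamg can be
>> selected at runtime.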
>> If your problem has dense rows, I have somewhere the code to recast it
>> into an augmented system, which is then solved using PCFIELDSPLIT (see
>> https://link.springer.com/article/10.1007/s11075-018-0478-2). I can send
>> it to you if needed.
>> Let me know if I can be of further assistance or if something is not
>> clear to you.
>>
>> Thanks,
>> Pierre
>>
>> On 26 Sep 2022, at 10:56 AM, fujisan <fujisan43 at gmail.com> wrote:
>>
>> OK thank you.
>>
>> On Mon, Sep 26, 2022 at 10:53 AM Jose E. Roman <jroman at dsic.upv.es>
>> wrote:
>>
>>> The QR factorization from SuiteSparse is sequential only; it cannot be
>>> used in parallel.
>>> In parallel you can try PCBJACOBI with a PCQR local preconditioner.
>>> Pierre may have additional suggestions.
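>>> For instance (a sketch, assuming the normal equations are supplied as
>>> the Pmat as Pierre describes):
>>> -ksp_type lsqr -pc_type bjacobi -sub_pc_type qr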
>>>
>>> Jose
>>>
>>>
>>> > On 26 Sept 2022, at 10:47, fujisan <fujisan43 at gmail.com> wrote:
>>> >
>>> > I did configure Petsc with the option --download-suitesparse.
>>> >
>>> > The error is more like this:
>>> > PETSC ERROR: Could not locate a solver type for factorization type QR
>>> and matrix type mpiaij.
>>> >
>>> > Fuji
>>> >
>>> > On Mon, Sep 26, 2022 at 10:25 AM Jose E. Roman <jroman at dsic.upv.es>
>>> wrote:
>>> > If the error message is "Could not locate a solver type for
>>> factorization type QR" then you should configure PETSc with
>>> --download-suitesparse
>>> >
>>> > Jose
>>> >
>>> >
>>> > > On 26 Sept 2022, at 9:06, fujisan <fujisan43 at gmail.com> wrote:
>>> > >
>>> > > Thank you Pierre,
>>> > >
>>> > > I used PCNONE along with KSPLSQR and it worked.
>>> > > But as for PCQR, it cannot be found. There is nothing about it in
>>> the documentation either.
>>> > >
>>> > > Fuji
>>> > >
>>> > > On Wed, Sep 21, 2022 at 12:20 PM Pierre Jolivet <pierre at joliv.et>
>>> wrote:
>>> > > Yes, but you need to use a KSP that handles rectangular Mat, such as
>>> KSPLSQR (-ksp_type lsqr).
>>> > > PCLU does not handle a rectangular Pmat. The only PCs that handle a
>>> rectangular Pmat are PCQR and PCNONE.
>>> > > If you supply the normal equations as the Pmat for LSQR, then you
>>> can use “standard” PCs.
>>> > > You can have a look at
>>> https://petsc.org/main/src/ksp/ksp/tutorials/ex27.c.html that covers
>>> most of these cases.
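>>> > > A minimal invocation might be something like (assuming ex27 follows
>>> > > the usual KSP tutorial convention of reading the system from a binary
>>> > > file with -f):
>>> > > ./ex27 -f matrix.dat -ksp_type lsqr -pc_type none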
>>> > >
>>> > > Thanks,
>>> > > Pierre
>>> > >
>>> > > (sorry for the earlier answer mistakenly sent to petsc-maint; please
>>> discard the previous email)
>>> > >
>>> > >> On 21 Sep 2022, at 10:03 AM, fujisan <fujisan43 at gmail.com> wrote:
>>> > >>
>>> > >> I'm trying to solve Ax=b with a sparse rectangular matrix A (of
>>> size 33x17 in my test) using
>>> > >> options '-ksp_type stcg -pc_type lu' on 1 or 2 CPUs.
>>> > >>
>>> > >> And I always get an error saying "Incompatible vector local
>>> lengths" (see below).
>>> > >>
>>> > >> Here are the relevant lines of my code:
>>> > >>
>>> > >> program test
>>> > >>     ...
>>> > >>     ! Variable declarations
>>> > >>
>>> > >>     PetscCallA(PetscInitialize(PETSC_NULL_CHARACTER,ierr))
>>> > >>
>>> > >>     PetscCallA(MatCreate(PETSC_COMM_WORLD,A,ierr))
>>> > >>     PetscCallA(MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,m,n,ierr))
>>> > >>     PetscCallA(MatSetType(A,MATMPIAIJ,ierr))
>>> > >>     PetscCallA(MatSetFromOptions(A,ierr))
>>> > >>     PetscCallA(MatSetUp(A,ierr))
>>> > >>     PetscCallA(MatGetOwnershipRange(A,istart,iend,ierr))
>>> > >>
>>> > >>     do irow=istart,iend-1
>>> > >>         ... Reading from file ...
>>> > >>
>>>  PetscCall(MatSetValues(A,1,irow,nzv,col,val,ADD_VALUES,ierr))
>>> > >>         ...
>>> > >>     enddo
>>> > >>
>>> > >>     PetscCallA(MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr))
>>> > >>     PetscCallA(MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr))
>>> > >>
>>> > >>     ! Creating vectors x and b
>>> > >>     PetscCallA(MatCreateVecs(A,x,b,ierr))
>>> > >>
>>> > >>     ! Duplicating x in u.
>>> > >>     PetscCallA(VecDuplicate(x,u,ierr))
>>> > >>
>>> > >>     ! u is used to calculate b
>>> > >>     PetscCallA(VecSet(u,1.0,ierr))
>>> > >>
>>> > >>     PetscCallA(VecAssemblyBegin(u,ierr))
>>> > >>     PetscCallA(VecAssemblyEnd(u,ierr))
>>> > >>
>>> > >>     ! Calculating Au = b
>>> > >>     PetscCallA(MatMult(A,u,b,ierr)) ! A.u = b
>>> > >>
>>> > >>     ! Creating the linear solver context
>>> > >>     PetscCallA(KSPCreate(PETSC_COMM_WORLD,ksp,ierr))
>>> > >>
>>> > >>     PetscCallA(KSPSetType(ksp,KSPCG,ierr))
>>> > >>
>>> > >>     PetscCallA(KSPSetOperators(ksp,A,A,ierr))
>>> > >>
>>> > >>     PetscCallA(KSPSetFromOptions(ksp,ierr))
>>> > >>
>>> > >>     ! Solving Ax = b, x unknown
>>> > >>     PetscCallA(KSPSolve(ksp,b,x,ierr))
>>> > >>
>>> > >>     PetscCallA(VecDestroy(x,ierr))
>>> > >>     PetscCallA(VecDestroy(u,ierr))
>>> > >>     PetscCallA(VecDestroy(b,ierr))
>>> > >>     PetscCallA(MatDestroy(A,ierr))
>>> > >>     PetscCallA(KSPDestroy(ksp,ierr))
>>> > >>
>>> > >>     PetscCallA(PetscFinalize(ierr))
>>> > >> end program
>>> > >>
>>> > >> The code reads a sparse matrix from a binary file.
>>> > >> I also output the sizes of matrix A and vectors b, x, u.
>>> > >> They all seem consistent.
>>> > >>
>>> > >> What am I doing wrong?
>>> > >> Is it possible to solve Ax=b with A rectangular?
>>> > >>
>>> > >> Thank you in advance for your help.
>>> > >> Have a nice day.
>>> > >>
>>> > >> Fuji
>>> > >>
>>> > >>  Matrix size : m=          33  n=          17  cpu size:           1
>>> > >>  Size of matrix A  :           33          17
>>> > >>  Size of vector b :           33
>>> > >>  Size of vector x :           17
>>> > >>  Size of vector u :           17
>>> > >> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> > >> [0]PETSC ERROR: Arguments are incompatible
>>> > >> [0]PETSC ERROR: Incompatible vector local lengths parameter # 1
>>> local size 33 != parameter # 2 local size 17
>>> > >> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>>> shooting.
>>> > >> [0]PETSC ERROR: Petsc Development GIT revision:
>>> v3.17.4-1341-g91b2b62a00  GIT Date: 2022-09-15 19:26:07 +0000
>>> > >> [0]PETSC ERROR: ./bin/solve on a x86_64 named master by fujisan Tue
>>> Sep 20 16:56:37 2022
>>> > >> [0]PETSC ERROR: Configure options --with-petsc-arch=x86_64
>>> --COPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3"
>>> --with-debugging=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort
>>> --with-single-library=1 --with-mpiexec=mpiexec --with-precision=double
>>> --with-fortran-interfaces=1 --with-make=1 --with-mpi=1
>>> --with-mpi-compilers=1 --download-fblaslapack=0 --download-hypre=1
>>> --download-cmake=0 --with-cmake=1 --download-metis=1 --download-parmetis=1
>>> --download-ptscotch=0 --download-suitesparse=1 --download-triangle=1
>>> --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1
>>> --download-mumps=1 --download-elemental=1 --download-spai=0
>>> --download-parms=1 --download-moab=1 --download-chaco=0 --download-fftw=1
>>> --with-petsc4py=1 --download-mpi4py=1 --download-saws
>>> --download-concurrencykit=1 --download-revolve=1 --download-cams=1
>>> --download-p4est=0 --with-zlib=1 --download-mfem=1 --download-glvis=0
>>> --with-opengl=0 --download-libpng=1 --download-libjpeg=1 --download-slepc=1
>>> --download-hpddm=1 --download-bamg=1 --download-mmg=0 --download-parmmg=0
>>> --download-htool=1 --download-egads=0 --download-opencascade=0
>>> PETSC_ARCH=x86_64
>>> > >> [0]PETSC ERROR: #1 VecCopy() at
>>> /data/softs/petsc/src/vec/vec/interface/vector.c:1607
>>> > >> [0]PETSC ERROR: #2 KSPSolve_BiCG() at
>>> /data/softs/petsc/src/ksp/ksp/impls/bicg/bicg.c:40
>>> > >> [0]PETSC ERROR: #3 KSPSolve_Private() at
>>> /data/softs/petsc/src/ksp/ksp/interface/itfunc.c:877
>>> > >> [0]PETSC ERROR: #4 KSPSolve() at
>>> /data/softs/petsc/src/ksp/ksp/interface/itfunc.c:1048
>>> > >> [0]PETSC ERROR: #5 solve.F90:218
>>> > >> Abort(75) on node 0 (rank 0 in comm 16): application called
>>> MPI_Abort(MPI_COMM_SELF, 75) - process 0
>>> > >>
>>> > >
>>> >
>>>
>>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/