[petsc-users] superlu_dist and MatSolveTranspose

Gaetan Kenway gaetank at gmail.com
Fri Aug 29 18:14:15 CDT 2014


Hi Antoine

We are also using PETSc for solving adjoint systems resulting from CFD. To
get around the MatSolveTranspose issue, we just assemble the transpose
matrix directly and then call KSPSolve(). If this is possible in your
application, I think it is probably the best approach.
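A minimal sketch of that idea (illustrative only; n, i, j, aij, b and x stand
in for your own sizes, entries and vectors, and it assumes a square operator):

    Mat            AT;   /* will hold A^T, assembled directly */
    KSP            ksp;
    PetscErrorCode ierr;

    ierr = MatCreate(PETSC_COMM_WORLD,&AT);CHKERRQ(ierr);
    ierr = MatSetSizes(AT,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
    ierr = MatSetFromOptions(AT);CHKERRQ(ierr);
    ierr = MatSetUp(AT);CHKERRQ(ierr);
    /* for each Jacobian entry a_ij, insert it at (j,i) so the transpose
       is built during assembly rather than transposed afterwards */
    ierr = MatSetValue(AT,j,i,aij,INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(AT,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(AT,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    /* a plain (non-transpose) solve with AT then gives x = A^{-T} b */
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,AT,AT);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);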

Gaetan


On Fri, Aug 29, 2014 at 3:58 PM, Antoine De Blois <
antoine.deblois at aero.bombardier.com> wrote:

> Hello Jed,
>
> Thank you for your quick response. I spent some time digging deeper into
> my problem. I wrote a shell script that sweeps through a number of ksp_type,
> pc_type and sub_pc_type combinations. So please disregard my comment that
> the transpose solve "does not converge properly". I had drawn that conclusion
> from my own code (not from ex10 and the extracted matrix), and a
> KSPSetFromOptions call was missing. Apologies for that.
>
> What remains is the performance issue. The MatSolveTranspose takes a very
> long time to complete. For a matrix with 3 million rows, MatSolveTranspose
> takes roughly 5 minutes on 64 CPUs, whereas the MatSolve is almost
> instantaneous. When I attach gdb to my code, PETSc seems to be stalled in
> MatLUFactorNumeric_SeqAIJ_Inode() for a long time. I also ran top on the
> compute node to check the RAM usage; it was hovering around 2 GB, so memory
> usage does not seem to be an issue here.
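>
> (As an aside, a hedged way to cross-check where the time goes without gdb
> is PETSc's built-in profiling; the executable name below is just a
> placeholder:
>
>   mpiexec -n 64 ./my_adjoint_solver -log_summary
>
> which reports the time spent per event, e.g. the numeric factorization
> versus the triangular solves.)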
>
> #0  0x00002afe8dfebd08 in MatLUFactorNumeric_SeqAIJ_Inode ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #1  0x00002afe8e07f15c in MatLUFactorNumeric ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #2  0x00002afe8e2afa99 in PCSetUp_ILU ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #3  0x00002afe8e337c0d in PCSetUp ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #4  0x00002afe8e39d643 in KSPSetUp ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #5  0x00002afe8e39e3ee in KSPSolveTranspose ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #6  0x00002afe8e300f8c in PCApplyTranspose_ASM ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #7  0x00002afe8e338c13 in PCApplyTranspose ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #8  0x00002afe8e3a8a84 in KSPInitialResidual ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #9  0x00002afe8e376c32 in KSPSolve_GMRES ()
>    from /gpfs/fs2/aero/SOFTWARE/FLOW_SOLVERS/FANSC/EXT_LIB/petsc-3.5.1/lib/libpetsc.so.3.5
> #10 0x00002afe8e39e425 in KSPSolveTranspose ()
>
> For that particular application, I was using:
> ksp_type:                       gmres
> pc_type:                        asm
> sub_pc_type:                    ilu
> adj_sub_pc_factor_levels        1
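>
> For reference, that corresponds to runtime options along the lines of
> (assuming the adjoint KSP carries the adj_ options prefix, which is why
> the last entry above is prefixed):
>
>   -adj_ksp_type gmres -adj_pc_type asm -adj_sub_pc_type ilu -adj_sub_pc_factor_levels 1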
>
> For small matrices, the MatSolveTranspose computing time is very similar
> to that of the plain MatSolve.
>
> And if I revert to a MatTranspose followed by a MatSolve, then the
> MatTranspose itself takes forever to finish... For a matrix with 3 million
> rows, MatTranspose takes 30 minutes on 64 CPUs!
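>
> (That fallback is essentially the following, with A, AT, ksp, b and x
> standing in for my own objects:
>
>   ierr = MatTranspose(A,MAT_INITIAL_MATRIX,&AT);CHKERRQ(ierr); /* explicit A^T: the expensive step */
>   ierr = KSPSetOperators(ksp,AT,AT);CHKERRQ(ierr);
>   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);                      /* plain solve with the transpose */
>
> so it is the explicit transpose, not the solve, that dominates.)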
>
> So thank you for implementing the transpose solve in superlu_dist. It
> would also be nice to have it with hypre.
> Let me know what you think and whether you have ideas on how to improve my
> computational time.
> Regards,
> Antoine
>
> -----Original Message-----
> From: Jed Brown [mailto:jed at jedbrown.org]
> Sent: Thursday, August 28, 2014 5:01 PM
> To: Antoine De Blois; 'petsc-users at mcs.anl.gov'
> Subject: Re: [petsc-users] superlu_dist and MatSolveTranspose
>
> Antoine De Blois <antoine.deblois at aero.bombardier.com> writes:
>
> > Hello everyone,
> >
> > I am trying to solve a A^T x = b system. For my applications, I had
> > realized that the MatSolveTranspose does not converge properly.
>
> What do you mean "does not converge properly"?  Can you send a test case
> where the transpose solve should be equivalent, but is not?  We have only a
> few tests for transpose solve and not all preconditioners support it, but
> where it is supported, we want to ensure that it is correct.
>
> > Therefore, I had implemented a MatTranspose followed by a MatSolve.
> > This proved to converge perfectly (which is strange since the
> > transposed matrix has the same eigenvalues as the untransposed...).
> > The problem is that for bigger matrices, the MatTranspose is very
> > costly and thus cannot be used.
>
> Costly in terms of memory?  (I want you to be able to use
> KSPSolveTranspose, but I'm curious what you're experiencing.)
>
> > I tried using the superlu_dist package. Although the package works
> > perfectly for the MatSolve, I get a "No support for this operation for
> > this object type" error with MatSolveTranspose. I reproduced the error
> > using MatView and the ex10 tutorial. I can provide the matrix and rhs upon
> > request. My command line was:
> >
> > ex10 -f0 A_and_rhs.bin -pc_type lu -pc_factor_mat_solver_package
> > superlu_dist -trans
> >
> > So is there an additional parameter I need to use for the transposed
> > solve?
> >
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [0]PETSC ERROR: No support for this operation for this object type
> > [0]PETSC ERROR: Matrix type mpiaij
>
> This is easy to add.  I'll do it now.
>
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.5.1, unknown
> > [0]PETSC ERROR: /gpfs/fs1/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/ksp/examples/tutorials/ex10 on a ARGUS_impi_opt named hpc-user11 by ad007804 Thu Aug 28 16:41:15 2014
> > [0]PETSC ERROR: Configure options --CFLAGS="-xHost -axAVX" --download-hypre --download-metis --download-ml --download-parmetis --download-scalapack --download-superlu_dist --download-mumps --with-c2html=0 --with-cc=mpiicc --with-fc=mpiifort --with-cxx=mpiicpc --with-debugging=yes --prefix=/gpfs/fs1/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/petsc-3.5.1 --with-cmake=/gpfs/fs1/aero/SOFTWARE/TOOLS/CMAKE/cmake-2.8.7/bin/cmake --with-valgrind=/gpfs/fs1/aero/SOFTWARE/TOOLS/PROGRAMMING/valgrind-3.9.0/bin/valgrind --with-shared-libraries=0
> > [0]PETSC ERROR: #1 MatSolveTranspose() line 3473 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/mat/interface/matrix.c
> > [0]PETSC ERROR: #2 PCApplyTranspose_LU() line 214 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/pc/impls/factor/lu/lu.c
> > [0]PETSC ERROR: #3 PCApplyTranspose() line 573 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/pc/interface/precon.c
> > [0]PETSC ERROR: #4 KSP_PCApply() line 233 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/include/petsc-private/kspimpl.h
> > [0]PETSC ERROR: #5 KSPInitialResidual() line 63 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/ksp/interface/itres.c
> > [0]PETSC ERROR: #6 KSPSolve_GMRES() line 234 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/ksp/impls/gmres/gmres.c
> > [0]PETSC ERROR: #7 KSPSolveTranspose() line 704 in /gpfs/fs2/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/ksp/interface/itfunc.c
> > [0]PETSC ERROR: #8 main() line 324 in /gpfs/fs1/aero/SOFTWARE/TOOLS/PROGRAMMING/petsc/src/ksp/ksp/examples/tutorials/ex10.c
> >
> > FYI, the transpose solve is a typical application for adjoint
> > optimization. There should be a large community of adjoint developers who
> > need to solve transposed systems.
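> >
> > In PETSc terms that is just (a minimal sketch; ksp, b and x are whatever
> > KSP and Vec objects the application already has for A x = b):
> >
> >   ierr = KSPSolveTranspose(ksp,b,x);CHKERRQ(ierr);  /* solves A^T x = b */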
> >
> > Any help is much appreciated,
> > Best,
> > Antoine
> >
> >
> > Antoine DeBlois
> > Specialiste ingenierie, MDO lead / Engineering Specialist, MDO lead
> > Aéronautique / Aerospace 514-855-5001, x 50862
> > antoine.deblois at aero.bombardier.com
> >
> > 2351 Blvd Alfred-Nobel
> > Montreal, Qc
> > H4S 1A9
> >
>

