[petsc-users] mumps solve with same nonzero pattern
Hong Zhang
hzhang at mcs.anl.gov
Fri Apr 27 14:09:22 CDT 2012
Wen:
> Thanks for your reply. I also tested the exact same problem with
> SuperLU_DIST. MUMPS and SuperLU_DIST both use sequential symbolic
> factorization. However, SuperLU_DIST takes only 20 seconds while MUMPS
> takes almost 700 seconds. I am wondering whether such a big difference is
> possible. Do those two direct solvers use quite different algorithms?
>
This is weird. Try
1) increasing the work space with
-mat_mumps_icntl_14 50 (the default is 20)
2) different matrix orderings with
-mat_mumps_icntl_7 2 (or a number from 0 to 6)
Run your code with '-log_summary' and see which routine causes this huge
difference.
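For example, a minimal petsc4py sketch that sets these options before the
solve (the options database is shared with the command line, so the same
flags work for any PETSc executable; the tridiagonal test system and all
names here are only illustrative, and this runs sequentially):

    from petsc4py import PETSc

    opts = PETSc.Options()
    opts['mat_mumps_icntl_14'] = 50  # enlarge the work space (default 20)
    opts['mat_mumps_icntl_7'] = 2    # try another ordering (0 to 6)

    n = 100  # small tridiagonal test system, just for illustration
    A = PETSc.Mat().createAIJ([n, n], nnz=3, comm=PETSc.COMM_SELF)
    for i in range(n):
        if i > 0:
            A[i, i - 1] = -1.0
        A[i, i] = 2.0
        if i < n - 1:
            A[i, i + 1] = -1.0
    A.assemble()
    b = PETSc.Vec().createSeq(n)
    b.set(1.0)
    x = b.duplicate()

    ksp = PETSc.KSP().create(comm=PETSc.COMM_SELF)
    ksp.setOperators(A)
    ksp.setType('preonly')           # direct solve: factor once, no Krylov
    ksp.getPC().setType('lu')
    ksp.getPC().setFactorSolverPackage('mumps')  # renamed in later releases
    ksp.setFromOptions()             # picks up the icntl options set above
    ksp.solve(b, x)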
Hong
>
> Also, since I might have systems with the same nonzero structure to be
> solved many times in different places, I am wondering whether I could
> save the symbolic factorization output somewhere and then read it back
> in for future solves. Thanks.
>
> Regards,
> Wen
>
>
>
> On Fri, Apr 27, 2012 at 1:06 PM, <petsc-users-request at mcs.anl.gov> wrote:
>
>>
>> Today's Topics:
>>
>> 1. mumps solve with same nonzero pattern (Wen Jiang)
>> 2. Re: mumps solve with same nonzero pattern (Matthew Knepley)
>> 3. High Re CFD method. (Christian Klettner)
>> 4. Re: writing native PETSc binaries in python
>> (Ataollah Mesgarnejad)
>>
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Fri, 27 Apr 2012 09:53:57 -0400
>> From: Wen Jiang <jiangwen84 at gmail.com>
>> Subject: [petsc-users] mumps solve with same nonzero pattern
>> To: petsc-users at mcs.anl.gov
>>
>>
>> Hi,
>>
>> I am using MUMPS in PETSc to solve a system with 0.2 million unknowns on
>> 128 cores. Within the Newton-Raphson iteration, a system with the same
>> sparsity structure but different values is solved many times, and I
>> specify that the preconditioning matrix has the same nonzero pattern.
>> The first solve costs around 900 seconds, and later solves take only
>> around 200 seconds. So I am wondering why the time differs that much. By
>> setting the same nonzero pattern for the PC, which of the MUMPS control
>> parameters does PETSc change? Thanks.
>>
>> Regards,
>> Wen
>>
>> ------------------------------
>>
>> Message: 2
>> Date: Fri, 27 Apr 2012 10:11:27 -0400
>> From: Matthew Knepley <knepley at gmail.com>
>> Subject: Re: [petsc-users] mumps solve with same nonzero pattern
>> To: PETSc users list <petsc-users at mcs.anl.gov>
>>
>>
>> On Fri, Apr 27, 2012 at 9:53 AM, Wen Jiang <jiangwen84 at gmail.com> wrote:
>>
>> > Hi,
>> >
>> > I am using MUMPS in PETSc to solve a system with 0.2 million unknowns
>> > on 128 cores. Within the Newton-Raphson iteration, a system with the
>> > same sparsity structure but different values is solved many times, and
>> > I specify that the preconditioning matrix has the same nonzero pattern.
>> > The first solve costs around 900 seconds, and later solves take only
>> > around 200 seconds. So I am wondering why the time differs that much.
>> > By setting the same nonzero pattern for the PC, which of the MUMPS
>> > control parameters does PETSc change? Thanks.
>> >
>>
>> The difference is that you do not have to perform the symbolic
>> factorization again if the nonzero pattern does not change,
>> just the numeric factorization. The symbolic factorization is the costly
>> step.
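>>
>> In petsc4py, for instance, the reuse pattern looks like this (a sketch:
>> A, b, x, and ksp are assumed to be set up for MUMPS as earlier in this
>> thread, update_jacobian() is a hypothetical application routine, and the
>> MatStructure flag follows the petsc-3.2-era KSPSetOperators signature):
>>
>>     from petsc4py import PETSc
>>
>>     for step in range(10):      # Newton-Raphson loop
>>         update_jacobian(A)      # new values, same nonzero pattern
>>         # with SAME_NONZERO_PATTERN the symbolic factorization runs
>>         # only on the first solve; later solves redo only the cheaper
>>         # numeric factorization
>>         ksp.setOperators(A, A, PETSc.Mat.Structure.SAME_NONZERO_PATTERN)
>>         ksp.solve(b, x)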
>>
>> Matt
>>
>>
>> > Regards,
>> > Wen
>> >
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> ------------------------------
>>
>> Message: 3
>> Date: Fri, 27 Apr 2012 11:26:33 +0100
>> From: "Christian Klettner" <ucemckl at ucl.ac.uk>
>> Subject: [petsc-users] High Re CFD method.
>> To: petsc-users at mcs.anl.gov
>>
>> Dear PETSc users,
>>
>> I know this is off topic, but the people reading these questions are
>> well placed to answer it, being actual CFD developers. Over the last
>> five years our group has developed a CFD code (using PETSc for its
>> parallel matrices and vectors) based on the characteristic-based split
>> (CBS) scheme to solve the incompressible Navier-Stokes equations (2D and
>> 3D) in a finite element framework using unstructured meshes. It is
>> first-order accurate in time and space. This has proved effective for
>> high-resolution, low-Reynolds-number, complex-geometry flows (e.g.
>> groups of bodies).
>>
>> We are hoping to move on to higher-Reynolds-number flows, with the
>> intention of using a large eddy simulation model for the turbulence. The
>> goal is to model the flow in a hospital room, with typical Reynolds
>> numbers of 10^5. At present we are not convinced that the CBS scheme is
>> the best scheme for these types of flows and are looking for other
>> people's opinions on alternative methods. Does anyone have any advice in
>> this direction?
>>
>> Best regards,
>> Christian
>>
>>
>>
>> ------------------------------
>>
>> Message: 4
>> Date: Fri, 27 Apr 2012 10:28:41 -0500
>> From: Ataollah Mesgarnejad <amesga1 at tigers.lsu.edu>
>> Subject: Re: [petsc-users] writing native PETSc binaries in python
>> To: PETSc users list <petsc-users at mcs.anl.gov>
>> Cc: petsc-dev at mcs.anl.gov
>>
>> Thanks for the replies:
>>
>> Just a bug report with petsc4py:
>>
>> The petsc4py that --download-petsc4py=1 fetches is the older v1.1.1,
>> which doesn't compile with petsc-dev.
>>
>> Thanks,
>> Ata
>>
>> On Fri, Apr 27, 2012 at 8:25 AM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> > On Fri, Apr 27, 2012 at 9:19 AM, Ataollah Mesgarnejad <
>> > amesga1 at tigers.lsu.edu> wrote:
>> >
>> >> Dear all,
>> >>
>> >> I was wondering if there is a way to write numpy arrays as native PETSc
>> >> binaries in Python, so I can open them inside my program with VecLoad?
>> >>
>> >
>> > I think what you want is to create a Vec using that array in petsc4py.
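>> >
>> > For example (a sketch; the array contents and file name are only
>> > illustrative):
>> >
>> >     import numpy
>> >     from petsc4py import PETSc
>> >
>> >     data = numpy.linspace(0.0, 1.0, 100)     # the numpy array to save
>> >     vec = PETSc.Vec().createWithArray(data)  # wraps the array, no copy
>> >     viewer = PETSc.Viewer().createBinary('vec.dat', 'w')
>> >     vec.view(viewer)   # native PETSc binary, readable with VecLoad
>> >     viewer.destroy()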
>> >
>> > Matt
>> >
>> >
>> >> Thanks,
>> >> Ata
>> >
>> > --
>> > What most experimenters take for granted before they begin their
>> > experiments is infinitely more interesting than any results to which
>> their
>> > experiments lead.
>> > -- Norbert Wiener
>> >
>>
>>
>>
>> --
>> A. Mesgarnejad
>> PhD Student, Research Assistant
>> Mechanical Engineering Department
>> Louisiana State University
>> 2203 Patrick F. Taylor Hall
>> Baton Rouge, La 70803
>>
>> ------------------------------
>>
>>
>
>