matrix assembling time

Ravi Kannan rxk at cfdrc.com
Fri Mar 13 16:39:25 CDT 2009


Hi Matt

Are you suggesting that we use MatGetOrdering()?
Will it work for a parallel matrix?

Thanks.

Ravi
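
For reference, here is a minimal sketch of the reordering path Matt
points to (the variable names are illustrative, and the ISDestroy()
calls follow the 2009-era convention; newer PETSc versions take the
address of the IS instead):

Mat A, Aperm;      /* A is the assembled system matrix */
IS  rperm, cperm;  /* row and column permutations */

/* compute a reverse Cuthill-McKee ordering to reduce the bandwidth */
MatGetOrdering(A, "rcm", &rperm, &cperm);
/* build the permuted matrix A(rperm, cperm) */
MatPermute(A, rperm, cperm, &Aperm);
ISDestroy(rperm);
ISDestroy(cperm);

For factorization-based preconditioners such as ILU, the ordering can
also be selected at run time with -pc_factor_mat_ordering_type rcm.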
  -----Original Message-----
  From: petsc-users-bounces at mcs.anl.gov
  [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Matthew Knepley
  Sent: Friday, March 13, 2009 11:34 AM
  To: PETSc users list
  Subject: Re: matrix assembling time


  On Fri, Mar 13, 2009 at 12:48 PM, Ravi Kannan <rxk at cfdrc.com> wrote:

    Hi,
       This is Ravi Kannan from CFD Research Corporation. One basic
    question on the ordering of linear solvers in PETSc: if my matrix A
    (in AX=B) is sparse and its bandwidth (i.e., the maximum distance of
    the nonzero elements from the diagonal) is large, does PETSc reorder
    the matrix/equations so that the system is solved more efficiently?
    If yes, is there a specific command for this?

  You can reorder the matrix using the MatOrdering class.

    Matt


    Thanks
    Ravi



    -----Original Message-----
    From: petsc-users-bounces at mcs.anl.gov
    [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Yixun Liu
    Sent: Friday, March 06, 2009 12:50 PM
    To: PETSC
    Subject: matrix assembling time


    Hi,
    Using PETSc, the matrix assembly time for a mesh with 6000 vertices is
    about 14 seconds when parallelized over 4 processors, while another
    sequential program based on the gmm library takes about 0.6 seconds.
    PETSc's solver is much faster than gmm's, but I don't know why its
    assembly is so slow, given that I have preallocated enough space for
    the matrix:

    MatMPIAIJSetPreallocation(sparseMeshMechanicalStiffnessMatrix, 1000,
                              PETSC_NULL, 1000, PETSC_NULL);
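
    If an exact preallocation is wanted instead of the single large
    estimate above, a minimal sketch (assuming nlocal locally owned rows
    and hypothetical arrays d_nnz/o_nnz counted from the mesh
    connectivity; the scalar estimates are ignored when the arrays are
    given):

    PetscInt *d_nnz, *o_nnz;  /* per-row counts: diagonal / off-diagonal block */
    PetscMalloc(nlocal*sizeof(PetscInt), &d_nnz);
    PetscMalloc(nlocal*sizeof(PetscInt), &o_nnz);
    /* ... fill d_nnz[i] and o_nnz[i] by counting the couplings of row i ... */
    MatMPIAIJSetPreallocation(sparseMeshMechanicalStiffnessMatrix,
                              0, d_nnz, 0, o_nnz);

    Running with the -info option reports any mallocs performed during
    MatSetValues(), which is a quick way to check whether the
    preallocation was actually sufficient.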

    Yixun






  --
  What most experimenters take for granted before they begin their
  experiments is infinitely more interesting than any results to which
  their experiments lead.
  -- Norbert Wiener

