[petsc-users] reusing LU factorization?

David Liu daveliu at mit.edu
Tue Jan 28 17:23:34 CST 2014


Wow, that is news to me. I always assumed this was normal.

I'm pretty certain it's not the preallocation. I'm using MatCreateMPIAIJ, and
to my knowledge I wouldn't even be able to set the values without crashing
if I didn't preallocate. (If I'm not mistaken, setting values slowly without
preallocating is only possible if you create the Mat using MatCreate +
MatSetUp.)
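
For concreteness, here is a rough sketch of what I mean (not my actual code;
the local size and the per-row nonzero counts are made-up placeholders, and
error checking with CHKERRQ is omitted for brevity):

/* Minimal sketch: creating a preallocated parallel AIJ matrix so that
 * MatSetValues() does not trigger mallocs during assembly.  The local size n
 * and the estimates of 5 diagonal-block and 2 off-diagonal-block nonzeros
 * per row are hypothetical. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      A;
  PetscInt n = 100;   /* hypothetical number of local rows/columns */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Preallocation happens right here, at creation time. */
  MatCreateAIJ(PETSC_COMM_WORLD, n, n, PETSC_DETERMINE, PETSC_DETERMINE,
               5, NULL, 2, NULL, &A);

  /* ... MatSetValues(), MatAssemblyBegin(), MatAssemblyEnd() ... */

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}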

Also, I'm certain that the time is spent in the first solve, not in setting
the values, because I use the matrix in a MatMult to compute the RHS before
solving; that MatMult (which requires a fully assembled matrix) completes
before the long first solve, so any slow assembly would have shown up there.
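
To illustrate the ordering I mean, here is a rough, self-contained sketch
(a stand-in diagonal operator and made-up sizes, not my real problem; the
PaStiX selection assumes PETSc was configured with PaStiX, KSPSetOperators
is written in its PETSc 3.4-era four-argument form, and error checking is
omitted):

/* Sketch of the call sequence: the MatMult that builds the RHS is cheap,
 * the first KSPSolve pays for the (symbolic + numeric) LU factorization,
 * and the second KSPSolve reuses the existing factors. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b, u;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 100;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Stand-in operator: a preallocated diagonal matrix. */
  MatCreateAIJ(PETSC_COMM_SELF, n, n, n, n, 1, NULL, 0, NULL, &A);
  for (i = 0; i < n; i++) MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  VecCreateSeq(PETSC_COMM_SELF, n, &u);
  VecDuplicate(u, &b);
  VecDuplicate(u, &x);
  VecSet(u, 1.0);

  MatMult(A, u, b);                 /* cheap: builds the RHS             */

  KSPCreate(PETSC_COMM_SELF, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);  /* PETSc 3.4-era call */
  KSPSetType(ksp, KSPPREONLY);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);
  PCFactorSetMatSolverPackage(pc, MATSOLVERPASTIX);

  KSPSolve(ksp, b, x);              /* slow: factorization happens here  */
  KSPSolve(ksp, b, x);              /* fast: existing factors are reused */

  KSPDestroy(&ksp);
  VecDestroy(&x); VecDestroy(&b); VecDestroy(&u);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}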


On Tue, Jan 28, 2014 at 5:04 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Jan 28, 2014, at 1:36 PM, David Liu <daveliu at mit.edu> wrote:
>
> > Hi, I'm writing an application that solves a sparse matrix many times
> > using PaStiX. I notice that the first solve takes a very long time,
>
>   Is it the first "solve" or the first time you put values into that
> matrix that "takes a long time"? If you are not properly preallocating the
> matrix then the initial setting of values will be slow and waste memory.
>  See
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatXAIJSetPreallocation.html
>
>   The symbolic factorization is usually much faster than a numeric
> factorization, so that is not the cause of the slow "first solve".
>
>    Barry
>
>
>
> > while the subsequent solves are very fast. I don't fully understand
> > what's going on behind the curtains, but I'm guessing it's because the very
> > first solve has to read in the nonzero structure for the LU factorization,
> > while the subsequent solves are faster because the nonzero structure
> > doesn't change.
> >
> > My question is, is there any way to save the information obtained from
> > the very first solve, so that the next time I run the application, the very
> > first solve can be fast too (provided that I still have the same nonzero
> > structure)?
>
>
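
For anyone finding this thread later: below is a minimal sketch of the
preallocation routine Barry links to above (MatXAIJSetPreallocation), used
with the MatCreate + MatSetType workflow. The local size and the per-row
counts are placeholders; real code would fill dnnz/onnz from the problem's
connectivity, and error checking is again omitted.

/* Sketch: preallocating via MatXAIJSetPreallocation() when the matrix is
 * created with MatCreate()/MatSetType() rather than MatCreateAIJ(). */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat       A;
  PetscInt  i, n = 100;        /* hypothetical local size */
  PetscInt *dnnz, *onnz;

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, n, n, PETSC_DETERMINE, PETSC_DETERMINE);
  MatSetType(A, MATAIJ);

  PetscMalloc(n * sizeof(PetscInt), &dnnz);
  PetscMalloc(n * sizeof(PetscInt), &onnz);
  for (i = 0; i < n; i++) { dnnz[i] = 5; onnz[i] = 2; }  /* placeholder counts */

  /* bs = 1; the last two arguments (upper-triangular counts, only needed for
     SBAIJ matrices) are passed as NULL for AIJ. */
  MatXAIJSetPreallocation(A, 1, dnnz, onnz, NULL, NULL);

  /* ... MatSetValues(), MatAssemblyBegin(), MatAssemblyEnd() ... */

  PetscFree(dnnz);
  PetscFree(onnz);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}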