Using ICC for MPISBAIJ?

Hong Zhang hzhang at mcs.anl.gov
Mon Feb 12 23:17:33 CST 2007



> Thank you very much for the help you gave me in tuning
> my code. I now think it is important for us to take
> advantage of the symmetric positive definiteness
> of our matrix, i.e., we should use the
> conjugate gradient (CG) method with incomplete
> Cholesky factorization (ICC) as the preconditioner (I
> assume this is commonly accepted, at least for serial
> computation, right?).
Yes.
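
For the serial case, this pairing can be selected entirely with runtime
options, for example (the executable name below is just a placeholder):

    ./myapp -ksp_type cg -pc_type icc -ksp_monitor

-ksp_type cg selects conjugate gradient and -pc_type icc selects
incomplete Cholesky; -ksp_monitor prints the residual history so you
can watch the convergence.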

> However, I am surprised and disappointed to realize
> that the -pc_type icc option only exists for seqsbaij
> matrices. In order to parallelize the linear solver, I

icc also works for the seqaij type, which allows more efficient
data access than seqsbaij.
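
For example, a minimal sketch in C (a fragment only; matrix assembly,
KSPSetOperators() and the solve are omitted, and n/nz are placeholders):

    Mat A;
    KSP ksp;
    PC  pc;

    /* sequential AIJ matrix; values and assembly omitted */
    MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, nz, PETSC_NULL, &A);

    KSPCreate(PETSC_COMM_SELF, &ksp);
    KSPSetType(ksp, KSPCG);     /* conjugate gradient */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCICC);       /* incomplete Cholesky */
    KSPSetFromOptions(ksp);     /* still overridable with -ksp_type/-pc_type */

Setting KSPSetFromOptions() last keeps the choices overridable from the
command line, which is usually the easier way to experiment.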

> have to use the external package BlockSolve95.
> I took a look at this package at
> http://www-unix.mcs.anl.gov/sumaa3d/BlockSolve/
> and I am very disappointed to see it hasn't been under
> development since 1997. I am worried it does not
> provide state-of-the-art performance.
>
> Nevertheless, I gave it a try. The package is not as
> easy to build as common Linux software (even harder to
> build than PETSc); in particular, according to their README,
> it is not known whether it works with Linux. However, by
> hand-editing the bmake/linux/linux.site file, I seemed
> to be able to build the library. However, the examples
> don't build, and the PETSc built with BlockSolve95
> gives me linking errors like:
> undefined reference to "dgemv_" and "dgetrf_".

This seems related to linking LAPACK: dgemv_ and dgetrf_ are
BLAS/LAPACK symbols. Satish might know more about it.
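
Such undefined symbols usually mean the BLAS/LAPACK libraries are
missing from (or appear too early on) the final link line. As a rough,
hypothetical illustration only (library names and paths differ per
system, and reconfiguring PETSc with the correct BLAS/LAPACK location
is the cleaner fix):

    # myapp, myapp.o and $PETSC_LIBS are placeholders
    mpicc -o myapp myapp.o $PETSC_LIBS -llapack -lblas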

>
> In another place in the PETSc manual, I found there is
> another external package, "Spooles", that can also be
> used with mpisbaij and a Cholesky PC. But it is also
> dated 1999.

Spooles is a sparse direct solver. Although it has not been
supported since 1999, we find it is still of good quality;
in particular, it is robust and portable.
PETSc also interfaces with other well-maintained sparse
direct solvers, e.g., MUMPS and SuperLU_DIST. When matrices are
on the order of 100k unknowns or smaller and ill-conditioned, the
direct solvers are good choices.
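
These solvers are driven through the Cholesky/LU preconditioner at
runtime; the exact option names depend on the PETSc version (in newer
releases the external package is chosen with -pc_factor_mat_solver_type;
check -help for yours). Roughly:

    -ksp_type preonly -pc_type cholesky -pc_factor_mat_solver_type mumps

which turns the "solve" into a single application of the MUMPS Cholesky
factorization.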
>
> Could anyone give me some advice on the best way
> to solve a large sparse symmetric positive
> definite linear system efficiently using MPI on a
> cluster?

Performance is application dependent. PETSc lets you
test various algorithms at runtime.
Use '-help' to see all possible options. Run your application
with '-log_summary' to collect and compare performance data.
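
For example, one common parallel combination to try for an SPD system
is block Jacobi with ICC on each block (the executable name and process
count below are placeholders):

    mpiexec -n 4 ./myapp -ksp_type cg -pc_type bjacobi -sub_pc_type icc \
        -ksp_monitor -log_summary

Comparing the '-log_summary' output of a few such runs is usually the
quickest way to decide what works best for your problem.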

Good luck,

Hong



