Using ICC for MPISBAIJ?

Hong Zhang hzhang at mcs.anl.gov
Mon Feb 12 23:22:48 CST 2007


I forgot to tell you that you can use parallel
CG with block-Jacobi, and sequential icc within the
diagonal blocks. For example, run
src/ksp/ksp/examples/tutorials/ex5 with
mpirun -np 2 ./ex5 -ksp_type cg -pc_type bjacobi -sub_pc_type icc
-ksp_view

Use '-help' to see the many options for icc.
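For reference, here is a minimal sketch of the same setup in
application code rather than on the command line. It assumes a Mat A
and Vec b, x that are already assembled, uses the PETSc 2.3-era
calling sequences, and omits error checking; the ICC sub-solver is
still picked up from -sub_pc_type icc via KSPSetFromOptions():

  #include "petscksp.h"

  /* Sketch: CG with block Jacobi, one block per process; the ICC
     sub-preconditioner is selected at run time via -sub_pc_type icc. */
  PetscErrorCode SolveSPD(Mat A, Vec b, Vec x)
  {
    KSP ksp;
    PC  pc;

    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
    KSPSetType(ksp, KSPCG);     /* parallel conjugate gradient */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCBJACOBI);   /* block Jacobi across processes */
    KSPSetFromOptions(ksp);     /* reads -sub_pc_type icc, -ksp_view, ... */
    KSPSolve(ksp, b, x);
    KSPDestroy(ksp);
    return 0;
  }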

Hong

On Mon, 12 Feb 2007, Shi Jin wrote:

> Hi All,
>
> Thank you very much for the help you gave me in tuning
> my code. I now think it is important for us to take
> advantage of the symmetric positive definiteness
> of our matrix, i.e., we should use the
> conjugate gradient (CG) method with incomplete
> Cholesky decomposition (ICC) as the preconditioner (I
> assume this is commonly accepted, at least for serial
> computation, right?).
> However, I am surprised and disappointed to realize
> that the -pc_type icc option only exists for seqsbaij
> matrices. In order to parallelize the linear solver, I
> have to use the external package BlockSolve95.
> I took a look at this package at
> http://www-unix.mcs.anl.gov/sumaa3d/BlockSolve/
> I am very disappointed to see that it hasn't been
> under development since 1997. I am worried it does not
> provide state-of-the-art performance.
>
> Nevertheless, I gave it a try. The package is not as
> easy to build as common Linux software (even harder
> than PETSc); in particular, according to its README,
> it is not known to work on Linux. By hand-editing the
> bmake/linux/linux.site file, I seemed to be able to
> build the library. However, the examples don't build,
> and PETSc built with BlockSolve95 gives me linking
> errors like:
> undefined reference to "dgemv_" and "dgetrf_".
>
> Elsewhere in the PETSc manual, I found that there is
> another external package, "Spooles", which can also be
> used with mpisbaij and a Cholesky PC. But it also
> dates from 1999.
>
> Could anyone give me some advice on the best way
> to solve a large sparse symmetric positive
> definite linear system efficiently using MPI on a
> cluster?
>
> Thank you very much.
> Shi
>
>



