[petsc-users] parallel sor

Jed Brown jed at jedbrown.org
Mon Feb 17 21:23:00 CST 2014


Xiangdong <epscodes at gmail.com> writes:

> Hello everyone,
>
> As noted in the PCSOR documentation, it corresponds to block Jacobi with
> SOR on each block, not a true parallel SOR.
>
> 1) Is there a true parallel SOR implemented in PETSc, such as red-black
> Gauss-Seidel?

Coloring for all but the simplest operators produces a huge number of
colors, especially in 3D.  Colored sweeps use memory bandwidth less
efficiently and thus tend to deliver poor performance.
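
To make the coloring idea concrete, here is a minimal sketch (untested,
not PETSc code) of red-black Gauss-Seidel for the 1D Poisson problem
with the standard 3-point stencil, where two colors suffice; the grid
size, forcing, and sweep count are arbitrary choices for illustration:

  #include <stdio.h>

  #define N 9                    /* interior grid points */

  int main(void)
  {
    double u[N + 2] = {0};       /* u[0] and u[N+1] hold Dirichlet BCs */
    double f[N + 2], h = 1.0 / (N + 1);
    int    i, sweep;

    for (i = 0; i < N + 2; i++) f[i] = 1.0;  /* constant forcing */

    for (sweep = 0; sweep < 100; sweep++) {
      /* red points (odd i): depend only on black neighbors */
      for (i = 1; i <= N; i += 2)
        u[i] = 0.5 * (u[i-1] + u[i+1] + h*h*f[i]);
      /* black points (even i): depend only on red neighbors */
      for (i = 2; i <= N; i += 2)
        u[i] = 0.5 * (u[i-1] + u[i+1] + h*h*f[i]);
    }
    for (i = 0; i < N + 2; i++) printf("u[%d] = %g\n", i, u[i]);
    return 0;
  }

Each colored loop is embarrassingly parallel, but a wider stencil (as in
3D or for higher-order discretizations) needs many more colors, and each
color sweeps over the whole vector, which is the bandwidth cost above.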

> 2) If it is not a true parallel SOR, then what is the difference between
> the options -pc_sor_forward and -pc_sor_local_forward? It seems that
> -pc_sor_forward does not work for the MPIAIJ format. When is
> -pc_sor_forward used?

SOR_FORWARD_SWEEP is currently only used for serial SOR.  MPI matrices
use SOR_LOCAL_FORWARD_SWEEP to make it explicit that this is not "true"
SOR.  A future implementation (e.g., of Mark Adams' algorithm) would use
SOR_FORWARD_SWEEP.
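
For reference, here is an untested sketch of selecting the local sweep
through the API instead of the command line (equivalent to
-pc_type sor -pc_sor_local_forward); it assumes an already-assembled
MPIAIJ matrix A:

  #include <petscksp.h>

  PetscErrorCode SetupLocalSOR(Mat A, KSP *ksp)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPCreate(PETSC_COMM_WORLD, ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(*ksp, A, A);CHKERRQ(ierr);
    ierr = KSPGetPC(*ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCSOR);CHKERRQ(ierr);
    /* MPI matrices accept only the SOR_LOCAL_* sweep types */
    ierr = PCSORSetSymmetric(pc, SOR_LOCAL_FORWARD_SWEEP);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(*ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }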

If you suspect the local SOR is a problem, compare against the same
problem run in serial.
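
For example (hypothetical executable ./app; -ksp_monitor prints the
residual history so the two runs can be compared):

  mpiexec -n 1 ./app -ksp_type richardson -pc_type sor -ksp_monitor
  mpiexec -n 4 ./app -ksp_type richardson -pc_type sor -ksp_monitor

The first run applies true SOR; the second applies block Jacobi with
SOR on each of the four blocks.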