mesh ordering and partition

Matthew Knepley knepley at
Fri May 16 12:37:52 CDT 2008

On Fri, May 16, 2008 at 11:47 AM, Gong Ding <gdiso at> wrote:
> ----- Original Message ----- From: "Matthew Knepley" <knepley at>
> To: <petsc-users at>
> Sent: Saturday, May 17, 2008 12:19 AM
> Subject: Re: mesh ordering and partition
>> If you are using the serial PETSc ILU, I think you should just use a
>> MatOrdering, which can be set from the command line:
>>  -pc_type ilu -pc_factor_mat_ordering_type rcm
>> which I tested on KSP ex2.
>>  Matt
> I am developing a parallel code for 3D semiconductor device simulation.
> From my experience with the 2D code, the GMRES solver with ILU works well
> (the matrix is nonsymmetric).
> As a result, I'd like to use GMRES+ILU again for 3D, in parallel.
> Does -pc_type ilu -pc_factor_mat_ordering_type rcm still work?
> Since the parallel matrix requires a contiguous index range in each
> subdomain, the matrix ordering seems troublesome.
> Maybe only a local ordering can be done... Am I right?

It's a local ordering. Remember that block-Jacobi ILU is a LOT worse than
serial ILU, so I would not expect it to scale very well. You can try ASM to
fix it up, but there are no guarantees.
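For reference, a sketch of the option combinations discussed above (assuming
the KSP tutorial example ex2 as the test program, as in Matt's earlier reply):
in parallel, the ILU preconditioner is applied per-process inside block
Jacobi or ASM, and the RCM ordering is set on the subdomain solver via the
sub_ prefix.

```shell
# Serial: GMRES + ILU with RCM reordering of the factored matrix
./ex2 -ksp_type gmres -pc_type ilu -pc_factor_mat_ordering_type rcm

# Parallel: block Jacobi with ILU + local RCM ordering on each block
mpiexec -n 4 ./ex2 -ksp_type gmres -pc_type bjacobi \
    -sub_pc_type ilu -sub_pc_factor_mat_ordering_type rcm

# Parallel: additive Schwarz (ASM) instead of block Jacobi; the subdomain
# overlap (default 1, adjustable with -pc_asm_overlap) may recover some of
# the robustness lost relative to serial ILU, with no guarantees
mpiexec -n 4 ./ex2 -ksp_type gmres -pc_type asm \
    -sub_pc_type ilu -sub_pc_factor_mat_ordering_type rcm
```

In both parallel cases the ordering is purely local to each subdomain matrix,
which is the point of Matt's answer.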


> Gong Ding

What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener
