[petsc-users] Avoid MUMPS ordering for distributed matrices?

Jose E. Roman jroman at dsic.upv.es
Wed Nov 23 09:51:46 CST 2011


On 23/11/2011, at 16:25, Barry Smith wrote:

> 
> On Nov 23, 2011, at 7:18 AM, Jed Brown wrote:
> 
>> On Wed, Nov 23, 2011 at 05:43, Dominik Szczerba <dominik at itis.ethz.ch> wrote:
>> In my procedure considerable time is spent to partition the domain.
>> When using MUMPS as a solver for my matrix I see the message:
>> 
>> "Ordering based on METIS"
>> 
>> This is an ordering to reduce fill in the factorization, not to partition the domain. Last I heard, symbolic factorization was done in serial, which explains why you find it taking a lot of time. The right-hand side and solution vectors are also passed in centrally on rank 0, which presents another inefficiency/imbalance and a memory bottleneck. Talk to the MUMPS developers or use a different package if you don't like these properties.
> 
>   MUMPS does now have an option for parallel ordering.
> 
>   Run with -help and look at the options like 
> 
> -mat_mumps_icntl_28: ICNTL(28): use 1 for sequential analysis and ICNTL(7) ordering, or 2 for parallel analysis and ICNTL(29) ordering
> 
> -mat_mumps_icntl_29: ICNTL(29): parallel ordering, 1 = ptscotch, 2 = parmetis
> 
> 
> I apologize that the options are organized in such a silly way, but that is how MUMPS is organized.
> 
>   Barry
> 
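
For reference, a complete command line that selects MUMPS and enables parallel analysis with ParMETIS ordering would look roughly like this (a minimal sketch; the executable name ./ex2 is hypothetical, and the LU/solver-package options are the standard PETSc ones):

    mpiexec -n 4 ./ex2 -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps \
        -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2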

I have a question related to this. We tried today (with petsc-dev)
-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
and it works, but 
-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 1
gives an error from MUMPS complaining that PTScotch was not enabled.

Should the combination MUMPS+PTScotch work in petsc-dev? We configured with --download-mumps --download-ptscotch --download-parmetis.

Jose



