want to make some changes to PCASM

Barry Smith bsmith at mcs.anl.gov
Fri Nov 21 07:27:42 CST 2008


    Richard,

     Yes, supporting multiple processes for a single block would be a
nice feature for both ASM and block Jacobi.
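
     As a rough illustration (not existing PETSc functionality), the
process groups for such shared blocks could come straight from
MPI_Comm_split(); 'ranks_per_node' below is a made-up parameter:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
      MPI_Comm block_comm;
      int      rank, ranks_per_node = 4;  /* assumption: 4 cores per node */

      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      /* Group consecutive ranks into subcommunicators of size
         ranks_per_node; each group could own one shared subdomain solve,
         e.g. a subdomain KSP created on block_comm instead of
         PETSC_COMM_SELF, so a parallel direct solver could be used. */
      MPI_Comm_split(MPI_COMM_WORLD, rank / ranks_per_node, rank, &block_comm);

      MPI_Comm_free(&block_comm);
      MPI_Finalize();
      return 0;
    }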

    Barry

On Nov 21, 2008, at 6:55 AM, Richard Tran Mills wrote:

> Folks,
>
> I believe this is orthogonal to what Lisandro is discussing, but would
> it be worthwhile to add support in PCASM for subdomains shared across a
> subset of processors? That way subdomain solves could be shared across
> cores in a socket, processors in an SMP node, etc. All the talk about
> the proliferation of cores per socket suggests this might pay off,
> although I'm not sure. In practice, for many of the problems I work
> with, one subdomain per processor still works fine even with tens of
> thousands of processors. Still, it might be worth exploring.
>
> I have not looked into the PCASM code to see what might be involved  
> with this -- it's just an idle thought.  Can anyone more familiar  
> with the issues comment?
>
> --Richard
>
> Lisandro Dalcin wrote:
>> I would like to make some changes and enhancements to PCASM:
>> 1) Make it take full ownership (both the allocated array memory and
>> the IS references) of the subdomain index sets (a sketch follows
>> below).
>> 2) Currently, if the matrix is symmetric, ASM switches to type
>> PC_ASM_BASIC in PCSetFromOptions_ASM(). I believe this should also be
>> handled in PCSetUp_ASM(); otherwise we get different behavior when
>> PCSetFromOptions() is never called. Moreover, I believe
>> PCSetFromOptions_ASM() should call PCASMSetType() if the
>> '-pc_asm_type' option is passed (see the sketch below).
>> 3) If more than one block per processor is requested, ASM currently
>> does a row-based partitioning. For unstructured problems and LU-based
>> local solves this is going to be really bad, right? I've already
>> implemented a 'smart' subdomain partitioner based on the
>> MatPartitioningXXX routines. I'm thinking about adding a new utility
>> routine (in the spirit of PCASMCreateSubdomains2D)
>>
>>   PCASMCreateSubdomains(Mat A, MatPartitioningType mptype,
>>                         PetscInt local_blocks, IS *is[])
>>
>> which could be called to get a good subdomain partitioning that is
>> then passed to PCASMSetLocalSubdomains() (a sketch follows below).
>> Does all this make sense?
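
For point 1, the ownership change might amount to something like the
following inside PCASMSetLocalSubdomains_ASM(); the 'osm' context and
its 'is' field are my guesses at the internals:

    /* Keep our own reference to each index set; PCDestroy_ASM() would
       later ISDestroy() them, so callers may destroy their copies
       immediately after this call. */
    for (i = 0; i < n; i++) {
      ierr = PetscObjectReference((PetscObject)is[i]);CHKERRQ(ierr);
      osm->is[i] = is[i];
    }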
>
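
For point 2, a minimal sketch, assuming the petsc-3.0-era calling
sequences (PetscTruth, options taken inside PetscOptionsBegin/End) and
made-up 'osm' field names:

    /* In PCSetUp_ASM(): default to PC_ASM_BASIC for symmetric matrices
       even when PCSetFromOptions() is never called; a real version would
       skip this when the user has set the type explicitly. */
    PetscTruth set, sym;
    ierr = MatIsSymmetricKnown(pc->pmat, &set, &sym);CHKERRQ(ierr);
    if (set && sym) osm->type = PC_ASM_BASIC;

    /* In PCSetFromOptions_ASM(): route the option through PCASMSetType()
       instead of writing the field directly. */
    PCASMType  asmtype;
    PetscTruth flg;
    ierr = PetscOptionsEnum("-pc_asm_type", "Type of restriction/extension",
                            "PCASMSetType", PCASMTypes, (PetscEnum)osm->type,
                            (PetscEnum*)&asmtype, &flg);CHKERRQ(ierr);
    if (flg) { ierr = PCASMSetType(pc, asmtype);CHKERRQ(ierr); }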
>
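
And for point 3, a condensed sketch of the proposed routine built on
MatPartitioning (again petsc-3.0-era calling sequences; it assumes
MatConvert() to MATMPIADJ works for the matrix type, and the mapping of
part numbers to processes is a simplification that a real implementation
would have to enforce):

    #include "petscpc.h"

    PetscErrorCode PCASMCreateSubdomains_Sketch(Mat A, MatPartitioningType mptype,
                                                PetscInt nblocks, IS *is[])
    {
      MPI_Comm        comm;
      PetscMPIInt     rank, size;
      MatPartitioning part;
      Mat             adj;
      IS              partis, gathered;
      const PetscInt  *parts;
      PetscInt        i, b, n, N, *idx;
      PetscErrorCode  ierr;

      PetscFunctionBegin;
      ierr = PetscObjectGetComm((PetscObject)A, &comm);CHKERRQ(ierr);
      ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
      ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr);

      /* Partition the adjacency graph of A into nblocks parts per process */
      ierr = MatConvert(A, MATMPIADJ, MAT_INITIAL_MATRIX, &adj);CHKERRQ(ierr);
      ierr = MatPartitioningCreate(comm, &part);CHKERRQ(ierr);
      ierr = MatPartitioningSetAdjacency(part, adj);CHKERRQ(ierr);
      ierr = MatPartitioningSetType(part, mptype);CHKERRQ(ierr);
      ierr = MatPartitioningSetNParts(part, nblocks * size);CHKERRQ(ierr);
      ierr = MatPartitioningApply(part, &partis);CHKERRQ(ierr);

      /* partis holds the part number of each locally owned row; gather the
         whole partition vector so every process can invert it locally */
      ierr = ISAllGather(partis, &gathered);CHKERRQ(ierr);
      ierr = ISGetSize(gathered, &N);CHKERRQ(ierr);
      ierr = ISGetIndices(gathered, &parts);CHKERRQ(ierr);

      /* Simplification: block b of process 'rank' is part rank*nblocks+b */
      ierr = PetscMalloc(nblocks * sizeof(IS), is);CHKERRQ(ierr);
      for (b = 0; b < nblocks; b++) {
        PetscInt target = rank * nblocks + b;
        for (n = 0, i = 0; i < N; i++) if (parts[i] == target) n++;
        ierr = PetscMalloc((n + 1) * sizeof(PetscInt), &idx);CHKERRQ(ierr);
        for (n = 0, i = 0; i < N; i++) if (parts[i] == target) idx[n++] = i;
        ierr = ISCreateGeneral(PETSC_COMM_SELF, n, idx, &(*is)[b]);CHKERRQ(ierr);
        ierr = PetscFree(idx);CHKERRQ(ierr);
      }

      ierr = ISRestoreIndices(gathered, &parts);CHKERRQ(ierr);
      ierr = ISDestroy(gathered);CHKERRQ(ierr);
      ierr = ISDestroy(partis);CHKERRQ(ierr);
      ierr = MatPartitioningDestroy(part);CHKERRQ(ierr);
      ierr = MatDestroy(adj);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }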
> -- 
> Richard Tran Mills, Ph.D.            |   E-mail: rmills at climate.ornl.gov
> Computational Scientist              |   Phone:  (865) 241-3198
> Computational Earth Sciences Group   |   Fax:    (865) 574-0405
> Oak Ridge National Laboratory        |   http://climate.ornl.gov/~rmills
>



