[petsc-dev] sacusp preconditioner - limited to seqaijcusp matrices?

Barry Smith bsmith at mcs.anl.gov
Wed Dec 14 14:48:39 CST 2011


  Sounds like a bug; please fix.

   barry

On Dec 14, 2011, at 2:47 PM, Satish Balay wrote:

> On Wed, 14 Dec 2011, John Fettig wrote:
> 
>> On Tue, Sep 20, 2011 at 12:58 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>> 
>>> 
>>> On Sep 20, 2011, at 11:54 AM, Brad Aagaard wrote:
>>> 
>>>> I tried running a PyLith simulation using the sacusp preconditioner
>>>> rather than ml. If I have the mat_type set to mpiaijcusp, I get an error
>>>> message that the sacusp preconditioner only works with CUSP matrices. I checked
>>>> sacusp.c and the code checks to make sure the mat_type is seqaijcusp.
>>> 
>>>   Since the sacusp preconditioner is not MPI parallel, it cannot deal with
>>> MPI matrices. ML is a truly (MPI) parallel multi-level preconditioner, so it
>>> is not really interchangeable with sacusp.
>>> 
>>>   You can use block Jacobi preconditioning -pc_type bjacobi -sub_pc_type
>>> sacusp   with sacusp on each block.
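
A minimal sketch of the in-code equivalent of that suggestion, using the
usual PCBJacobiGetSubKSP pattern; it assumes an already assembled aijcusp
matrix A, a CUSP-enabled PETSc build, and a recent KSP API (older releases
pass an extra MatStructure flag to KSPSetOperators). The helper name is
illustrative, and "sacusp" is the preconditioner type string used in this
thread.

    #include <petscksp.h>

    /* Sketch: CG with block Jacobi and an SA-CUSP preconditioner on each
       local block -- the in-code form of
       "-ksp_type cg -pc_type bjacobi -sub_pc_type sacusp". */
    PetscErrorCode SolveWithBJacobiSACUSP(Mat A, Vec b, Vec x)
    {
      KSP            ksp, *subksp;
      PC             pc, subpc;
      PetscInt       i, nlocal, first;
      PetscErrorCode ierr;

      ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
      ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      ierr = KSPSetUp(ksp);CHKERRQ(ierr);               /* sub-KSPs exist only after setup */
      ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
      for (i = 0; i < nlocal; i++) {
        ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
        ierr = PCSetType(subpc, "sacusp");CHKERRQ(ierr); /* type name as used in this thread */
      }
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      return 0;
    }
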
>> 
>> 
>> Hi Barry,
>> 
>> Sorry to dig up a very old thread, but I just tried your suggestion of
>> using block Jacobi with sacusp on the blocks and I can't get it to work.
>> For example, I run src/ksp/ksp/examples/tutorials/ex2.c with:
>> 
>> mpirun -np 2 ./ex2 -ksp_type cg -pc_type bjacobi -sub_pc_type sacusp
>> -mat_type mpiaijcusp -vec_type mpicusp
>> 
>> and it still bombs on line 139 of sacusp.cu because the vector type is not
>> VECSEQCUSP.  What am I doing wrong?
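
The failure at that point is a vector-type guard. The fragment below is only
an illustrative sketch of such a check, not the actual sacusp.cu source; it
assumes a CUSP-enabled build in which the VECSEQCUSP type exists, and the
helper name is made up. It shows why a sub-block vector that comes back typed
as plain "seq" or "mpi" gets rejected.

    #include <petscvec.h>

    /* Illustrative only: the kind of guard that produces the error above
       when the vector handed to the preconditioner is not VECSEQCUSP. */
    static PetscErrorCode CheckVecIsSeqCUSP(Vec x)
    {
      PetscBool      iscusp;
      PetscErrorCode ierr;

      ierr = PetscObjectTypeCompare((PetscObject)x, VECSEQCUSP, &iscusp);CHKERRQ(ierr);
      if (!iscusp) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_SUP, "Currently only handles CUSP vectors");
      return 0;
    }
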
> 
> 
> Looks like VecDuplicate() -> VecDuplicate_MPICUSP() is returning an
> 'mpi' Vec instead of an 'mpicusp' Vec. [Is this a bug?] Will check...
> 
> Satish
> 
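
A quick way to check the behaviour Satish describes is to duplicate an
'mpicusp' Vec directly and print the type of the copy; with the bug present
the duplicate comes back as 'mpi'. A minimal sketch, assuming a CUSP-enabled
build and the 'mpicusp' type name used in this thread:

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec            v, w;
      VecType        vtype;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      ierr = VecCreate(PETSC_COMM_WORLD, &v);CHKERRQ(ierr);
      ierr = VecSetSizes(v, PETSC_DECIDE, 100);CHKERRQ(ierr);
      ierr = VecSetType(v, "mpicusp");CHKERRQ(ierr);   /* type name taken from this thread */
      ierr = VecDuplicate(v, &w);CHKERRQ(ierr);        /* should preserve the CUSP type */
      ierr = VecGetType(w, &vtype);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_WORLD, "duplicate has type: %s\n", vtype);CHKERRQ(ierr);
      ierr = VecDestroy(&v);CHKERRQ(ierr);
      ierr = VecDestroy(&w);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

Run under mpirun (e.g. -np 2) to match the failing case above.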



