[petsc-dev] sacusp preconditioner - limited to seqaijcusp matrices?

John Fettig john.fettig at gmail.com
Wed Dec 14 14:44:34 CST 2011


On Wed, Dec 14, 2011 at 3:24 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Wed, Dec 14, 2011 at 2:21 PM, John Fettig <john.fettig at gmail.com> wrote:
>
>> On Tue, Sep 20, 2011 at 12:58 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>> On Sep 20, 2011, at 11:54 AM, Brad Aagaard wrote:
>>>
>>> > I tried running a PyLith simulation using the sacusp preconditioner
>>> rather than ml. If I have the mat_type set to mpiaijcusp, I get an error
>>> message that the sacusp preconditioner only works with CUSP matrices. I checked
>>> sacusp.c and the code checks to make sure the mat_type is seqaijcusp.
>>>
>>>    Since the sacusp preconditioner is not MPI parallel it cannot deal
>>> with MPI matrices. ML is a truly (MPI) parallel multi-level preconditioner
>>> so it is not really interchangeable with sacusp.
>>>
>>>    You can use block Jacobi preconditioning -pc_type bjacobi
>>> -sub_pc_type sacusp   with sacusp on each block.
>>
>>
>> Hi Barry,
>>
>> Sorry to dig up a very old thread, but I just tried your suggestion of
>> using block Jacobi with sacusp on the blocks, and I can't get it to work.
>> For example, I run src/ksp/ksp/examples/tutorials/ex2.c with:
>>
>>  mpirun -np 2 ./ex2 -ksp_type cg -pc_type bjacobi -sub_pc_type sacusp
>> -mat_type mpiaijcusp -vec_type mpicusp
>>
>> and it still bombs on line 139 of sacusp.cu because the vector type is
>> not VECSEQCUSP.  What am I doing wrong?
>>
>
> You might need -vec_type cusp (should give seqcusp on 1 proc)
>

Ok, I tried using -vec_type cusp and got the same error.  I'm running on 2
procs, so this still results in mpicusp, doesn't it?
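
For reference, here is roughly the programmatic equivalent of what I'm
trying from the command line (a minimal, untested sketch against the
3.2-era API, with a small 1-D Laplacian standing in for ex2's operator);
setting "sacusp" on each local block through PCBJacobiGetSubKSP() should
amount to the same thing as -pc_type bjacobi -sub_pc_type sacusp:

#include <petscksp.h>

int main(int argc, char **args)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp, *subksp;
  PC             pc, subpc;
  PetscInt       i, n = 100, Istart, Iend, nlocal, first;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &args, (char *)0, (char *)0);CHKERRQ(ierr);

  /* Assemble a 1-D Laplacian; -mat_type mpiaijcusp is picked up here */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, 3, PETSC_NULL, 3, PETSC_NULL);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(A, 3, PETSC_NULL);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   { ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    if (i < n-1) { ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr); }
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Vectors; -vec_type cusp is picked up here */
  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecSetSizes(b, PETSC_DECIDE, n);CHKERRQ(ierr);
  ierr = VecSetFromOptions(b);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* CG with block Jacobi; each process owns one block */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);   /* sub-KSPs exist only after setup */

  /* Put sacusp on each local block, same effect as -sub_pc_type sacusp */
  ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
  for (i = 0; i < nlocal; i++) {
    ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc, "sacusp");CHKERRQ(ierr);
  }

  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Run with the same options as above, e.g. mpirun -np 2 ./ex -mat_type
mpiaijcusp -vec_type cusp; I would still expect each block's vectors to
have to come out as seqcusp for sacusp to accept them.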

John