[petsc-dev] petsc-dev fails test with debugging=0

Andrea Lani andrea.lani at gmail.com
Mon Jan 20 01:59:48 CST 2014


This sounds good!

In my original e-mail I was also pointing out (1) a test failure of master in optimized mode and (2) another (possible) problem:

is PCASM currently supposed to work in combination with GPU-based KSPGMRES on multiple GPUs? For me, it doesn't: the result is correct on a single GPU but not on more than one.

Would it be possible to make PETSc abort, after printing some "consistency-checking" message from inside the code, when somebody like me attempts a "forbidden" or "not-yet-supported" combination of solver type and PC type, instead of letting the code crunch numbers and eventually break?

Since all your solvers, preconditioners, etc. are ultimately identified by strings, 
this should be achievable, I guess. 

It's a question that my own code's users often ask me too, and I imagine that for PETSc, with the massive number of choices you offer, such a check could start to make sense as a way of saving debugging time... 
What do you think?

Andrea


On Jan 20, 2014, at 12:26 AM, Jed Brown <jed at jedbrown.org> wrote:

> Karl Rupp <rupp at mcs.anl.gov> writes:
>> It depends on the amount of stability one wants to put in. If Gaussian 
>> elimination without pivoting is sufficient, Block-Jacobi shouldn't be 
>> too hard for the block size of interest (9-by-9). 
> 
> This is called pbjacobi.  I think that is a better choice to put on the
> GPU.
> 
>> However, I suppose we do want robust solves, so it's better to use
>> some explicit solution formulas.  Cramer's rule is okay up to 4-by-4
>> blocks or maybe 5-by-5, but likely to be a mess for 9-by-9.
> 
> PBJacobi inverts all the diagonals.  I suggest just doing that on the
> CPU and then transferring the inverses to the GPU.  You'll need a
> custom kernel on the GPU.


