[petsc-dev] discrepancy with bcgs, mpi, cusp, and no preconditioning

Chetan Jhurani chetan.jhurani at gmail.com
Thu May 10 14:45:30 CDT 2012


Hi wizards of "->ops->" indirections,

I'm trying to use bcgs without preconditioning, for now, and
the iterations using "-vec_type cusp -mat_type mpiaijcusp" don't
match those from the serial or non-GPU runs.  I've attached the
test program and the 4 outputs (serial/parallel x CPU/GPU).  All
this is with petsc-dev downloaded just now and real scalars.
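
For concreteness, the four runs amount to roughly the following
(a sketch; the executable name, the monitor flag, and the serial
GPU matrix type are my shorthand, not necessarily what the
attached test uses):

    # serial, CPU
    ./test_ksp -ksp_type bcgs -pc_type none -ksp_monitor
    # parallel, CPU
    mpiexec -n 2 ./test_ksp -ksp_type bcgs -pc_type none -ksp_monitor
    # serial, GPU
    ./test_ksp -ksp_type bcgs -pc_type none -vec_type cusp -mat_type seqaijcusp -ksp_monitor
    # parallel, GPU -- the odd one out
    mpiexec -n 2 ./test_ksp -ksp_type bcgs -pc_type none -vec_type cusp -mat_type mpiaijcusp -ksp_monitor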

Only the parallel GPU results differ, starting from the third
residual norm shown in result.txt.  The other three match one
another.  Am I doing something wrong?

fbcgs (bcgs with -ksp_bcgs_flexible) works fine with all of the
serial/parallel and CPU/GPU combinations I've tried.

Let me know if you need the matrix, rhs, and initial guess 
binary files that are read in by the test program.
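
In case it helps, the core of the test program is roughly the
following (a sketch, not the attached code; the file names A.bin
and b.bin are placeholders, and error checking is omitted):

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat         A;
      Vec         b, x;
      KSP         ksp;
      PetscViewer viewer;

      PetscInitialize(&argc, &argv, 0, 0);

      /* Load the matrix; -mat_type mpiaijcusp (or aij) takes effect here */
      PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.bin", FILE_MODE_READ, &viewer);
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetFromOptions(A);
      MatLoad(A, viewer);
      PetscViewerDestroy(&viewer);

      /* Load the rhs; -vec_type cusp takes effect here */
      PetscViewerBinaryOpen(PETSC_COMM_WORLD, "b.bin", FILE_MODE_READ, &viewer);
      VecCreate(PETSC_COMM_WORLD, &b);
      VecSetFromOptions(b);
      VecLoad(b, viewer);
      PetscViewerDestroy(&viewer);

      /* The initial guess is loaded the same way into x (omitted) */
      VecDuplicate(b, &x);

      KSPCreate(PETSC_COMM_WORLD, &ksp);
      /* 4-argument form, as in petsc-dev at the time */
      KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
      KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);
      KSPSetFromOptions(ksp);  /* picks up -ksp_type bcgs -pc_type none */
      KSPSolve(ksp, b, x);

      KSPDestroy(&ksp);
      MatDestroy(&A);
      VecDestroy(&b);
      VecDestroy(&x);
      PetscFinalize();
      return 0;
    }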

Thanks,

Chetan

-------------- next part --------------
Attachment: test_ksp.cpp
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20120510/d2a5c52f/attachment.ksh>
-------------- next part --------------
Attachment: result.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20120510/d2a5c52f/attachment.txt>

