[petsc-dev] discrepancy with bcgs, mpi, cusp, and no preconditioning
Matthew Knepley
knepley at gmail.com
Thu May 10 14:49:55 CDT 2012
On Thu, May 10, 2012 at 3:45 PM, Chetan Jhurani <chetan.jhurani at gmail.com> wrote:
> Hi wizards of "->ops->" indirections,
>
> I'm trying to use bcgs without preconditioning, for now, and
> the iterations using "-vec_type cusp -mat_type mpiaijcusp" don't
> match those from the serial or non-GPU runs. I've attached the test
> program and the 4 outputs (serial/parallel + CPU/GPU). All this is
> with petsc-dev downloaded just now and real scalars.
>
> Only the parallel GPU results differ, starting from the
> third residual norm shown in results.txt. The other three match
> one another. Am I doing something wrong?
>
> fbcgs (bcgs with -ksp_bcgs_flexible) works fine in all the
> serial/parallel and CPU/GPU combinations I've tried.
>
> Let me know if you need the matrix, rhs, and initial guess
> binary files that are read in by the test program.
>
That would be great. This looks like a bug that should be tracked down.
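In the meantime, here is a minimal sketch of a driver along the lines you
describe, in case anyone else wants to reproduce it. This is not your
attached test program: the binary file names (A.bin, b.bin, x0.bin) are
placeholders, and the KSPSetOperators() signature is the one current
petsc-dev expects.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            b, x;
  KSP            ksp;
  PC             pc;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Load the matrix; MatSetFromOptions() honors -mat_type mpiaijcusp */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatLoad(A, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Load the right-hand side; VecSetFromOptions() honors -vec_type cusp */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "b.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecSetFromOptions(b);CHKERRQ(ierr);
  ierr = VecLoad(b, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* Load the initial guess */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "x0.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);
  ierr = VecLoad(x, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* BiCGStab with no preconditioning */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPBCGS);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCNONE);CHKERRQ(ierr);
  ierr = KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Run it serially with "./test -ksp_monitor", and in parallel on the GPU
with "mpiexec -n 2 ./test -vec_type cusp -mat_type mpiaijcusp -ksp_monitor";
adding -ksp_bcgs_flexible should switch it to the fbcgs variant you mention.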
Matt
> Thanks,
>
> Chetan
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener