[petsc-users] Problem when switching from debug to optimized

Matthew Knepley knepley at gmail.com
Thu Nov 10 08:13:28 CST 2011


On Thu, Nov 10, 2011 at 12:30 PM, Bogdan Dita <bogdan at lmn.pub.ro> wrote:

>
>  Hello,
>
>  Until a few days ago I had only been using PETSc in debug mode, and when
> I switched to the optimised version (--with-debugging=0) I got a strange
> result regarding the solve time: it was 10-15% higher than in debug mode.
>  I'm trying to solve a linear system in parallel with superlu_dist, and
> I've tested my program on a Beowulf cluster, so far only using a single
> node with 2 quad-core Intel processors.
>  From what I know the "no debug" version should be faster, and I know it
> should be, because on my laptop (dual-core Intel), for the same program
> and even the same matrices, the solve time of the optimised version is
> 2 times faster; but when I use the cluster, the optimised version is
> slower than the debug version.
>  My guess is that this has something to do with MPI. Any thoughts?
>
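[Editor's note: for context, the two builds being compared are typically configured as below. Only --with-debugging=0 appears in the original message; the PETSC_ARCH names and optimization flags are illustrative, not taken from the poster's setup.]

```shell
# Debug build (the PETSc default): extra argument checking and error
# tracing, no compiler optimization
./configure PETSC_ARCH=arch-debug --with-debugging=1

# Optimized build: debugging checks off, optimization flags passed to the
# C, C++, and Fortran compilers
./configure PETSC_ARCH=arch-opt --with-debugging=0 \
    COPTFLAGS='-O3' CXXOPTFLAGS='-O3' FOPTFLAGS='-O3'
```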

Performance questions are meaningless without the output of -log_summary
for all cases.
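[Editor's note: a sketch of collecting that output for the SuperLU_DIST run described above; the executable name and rank count are hypothetical. -log_summary was the option name in PETSc 3.x of this era; it was later renamed -log_view.]

```shell
# Run on 8 ranks (one node, 2 quad-core CPUs) and print PETSc's
# per-stage timing/flop summary at exit
mpiexec -n 8 ./my_solver \
    -pc_type lu -pc_factor_mat_solver_package superlu_dist \
    -log_summary
```

Comparing the -log_summary tables from the debug and optimized runs on both machines shows which events (factorization, solve, MPI communication) account for the difference.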

   Matt


>  Best regards,
>   Bogdan Dita
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
