[petsc-users] Fwd: Performance RH5 vs RH6

Mohamad M. Nasr-Azadani mmnasr at gmail.com
Wed Jul 3 13:24:37 CDT 2013

---------- Forwarded message ----------
From: Mohamad M. Nasr-Azadani <mmnasr at gmail.com>
Date: Wed, Jul 3, 2013 at 11:24 AM
Subject: Re: [petsc-users] Performance RH5 vs RH6
To: Tabrez Ali <stali at geology.wisc.edu>

@Matt:
> Nope. The right thing to do is look at -log_summary output.

Thanks. I will try this later.
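For reference, a minimal way to collect that profile (assuming the executable is called ./mysolver, which is a placeholder; on PETSc releases of that era the option is -log_summary, later renamed -log_view):

```shell
# Run the solver under MPI and have PETSc print a performance summary
# (time, flops, and message counts per event) at the end of the run.
mpirun -np 16 ./mysolver -log_summary > log_rh6.txt 2>&1
```

Repeating the identical run on the RH5 stack and comparing the per-event timings should show whether the slowdown sits in the solver kernels (e.g. MatMult), the preconditioner setup (PCSetUp), or MPI communication.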

> FWIW, I recently upgraded an OpenMPI installation from 1.4.x to 1.6.x and
> observed very significant slow-downs.  You might want to try reverting your
> MPI library.

That sounds like a plausible cause, as I used to use OpenMPI 1.4 and had to
recompile PETSc with 1.6 after the upgrade. I will definitely try reverting to
see whether it makes any difference.
Thank you!
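If it helps, a quick way to confirm which Open MPI version is actually being picked up at runtime (both commands are standard Open MPI tools):

```shell
# Report the Open MPI version of the mpirun found first on PATH.
mpirun --version

# ompi_info prints the build configuration of the Open MPI
# installation, including the compiler and configure options used.
ompi_info | head -n 20
```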

 Did you by chance turn debugging on while configuring PETSc?
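One quick way to check that on an installed PETSc tree (assuming PETSC_DIR and PETSC_ARCH are set as usual; a debug build defines PETSC_USE_DEBUG in petscconf.h):

```shell
# --with-debugging=1 is the PETSc configure default and defines
# PETSC_USE_DEBUG; an optimized build (--with-debugging=0) does not.
grep -q PETSC_USE_DEBUG "$PETSC_DIR/$PETSC_ARCH/include/petscconf.h" \
    && echo "debug build" || echo "optimized build"
```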



On Wed, Jul 3, 2013 at 11:17 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:

> Did you by chance turn debugging on while configuring PETSc?
> Tabrez
> On 07/03/2013 12:16 PM, Mohamad M. Nasr-Azadani wrote:
>> Hi guys,
>> Recently, a supercomputer I had been using for the past year, upgraded
>> their OS from RH5 to RH6. After recompiling PETSc along with Hypre with
>> various compilers (gcc and intel) and mpi packages (openmpi and mvapich2),
>> the performance I observe on RH6 is significantly worse, e.g. my code is
>> close to 30-40% slower.
>> My code is a finite-difference Navier-Stokes solver and uses BoomerAMG to
>> precondition the pressure Poisson equation.
>> I have not done a thorough profiling yet, but I was wondering if you have
>> encountered a similar experience before.
>> Thanks,
>> Mohamad