[petsc-users] Communication time with threading enabled.
Gonçalo Albuquerque
goncalo.albuquerque at insilicio.com
Thu Feb 14 14:06:17 CST 2013
Dear Shri,
Just tried with --download-mpich and all is well. Thanks.
Confirmed that using --download-openmpi (instead of using Ubuntu's OpenMPI)
produces the same slowdown in communications. Another visible symptom is
that it takes a long time to print the log_summary data.
Any hints as to why OpenMPI displays this behavior?
Thank you again for your help.
Regards,
Gonçalo
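
For anyone wanting to reproduce the comparison discussed in this thread, the two runs differ only in the threadcomm options passed at the command line. This is a minimal sketch assuming a PETSc example binary (e.g. KSP ex5, as in the attached logs) built against the MPI implementation under test; the executable name and rank count are illustrative:

```shell
# Baseline: 2 MPI ranks, threading layer disabled
mpiexec -n 2 ./ex5 -threadcomm_type nothread -log_summary

# Same solve through the OpenMP threadcomm with a single thread;
# with OpenMPI this showed MPI_Barrier/MPI_Send times several
# orders of magnitude higher in the logs attached below
mpiexec -n 2 ./ex5 -threadcomm_type openmp -threadcomm_nthreads 1 -log_summary
```

Comparing the "Average time for MPI_Barrier()" and "Average time for zero size MPI_Send()" lines in the two -log_summary outputs is enough to see the slowdown.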
On Thu, Feb 14, 2013 at 5:32 PM, Shri <abhyshr at mcs.anl.gov> wrote:
> This has something to do with OpenMPI. I cannot reproduce this issue with
> MPICH. Can you try switching to MPICH (--download-mpich)?
>
> Shri
> On Feb 14, 2013, at 6:58 AM, Gonçalo Albuquerque wrote:
>
> > Dear All,
> >
> > I'm experimenting with PETSc hybrid MPI/OpenMP capabilities and I have
> run a rather simple test case (2D magnetostatic) using PETSc compiled with
> both OpenMPI and thread support (both PThreads and OpenMP) on a Ubuntu
> 12.04 system. I cannot make sense of the results obtained when comparing runs
> made using the same number of MPI processes (2) and specifying either no
> threads (-threadcomm_type nothread) or 1 OpenMP thread (-threadcomm_type
> openmp -threadcomm_nthreads 1). I attached the logs of both runs. It seems
> that the communication time has literally exploded. A grep over the logs
> gives:
> >
> > No threading:
> > Average time for MPI_Barrier(): 1.38283e-06
> > Average time for zero size MPI_Send(): 7.03335e-06
> >
> > 1 OpenMP thread:
> > Average time for MPI_Barrier(): 0.00870218
> > Average time for zero size MPI_Send(): 0.00614798
> >
> > The same thing occurs when running KSP ex5 (see attached logs).
> >
> > Any ideas as to what I'm missing?
> >
> > Many thanks in advance,
> >
> > Gonçalo
> >
> <nothread.log><openmp_nthreads_1.log><ex5_nothread.log><ex5_openmp_nthreads_1.log>
>
>