[petsc-users] Good performance For small problem on GALILEO

Matthew Knepley knepley at gmail.com
Fri Oct 20 05:31:52 CDT 2017


Justin is right that parallelism will be of limited value for such small
systems. This looks like a serial optimization job.
Moreover, in this case, a better numerical method usually trumps any kind
of machine optimization.

   Matt
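
As a rough check of the ~10,000-unknowns-per-MPI-process rule of thumb from
the FAQ Justin links below (the two-process split here is only illustrative):

    600 unknowns / 2 MPI processes = 300 unknowns per process,

which is roughly 30x below the recommended minimum, so communication and
synchronization overhead swamps the small amount of local work on each rank.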

On Fri, Oct 20, 2017 at 2:55 AM, Justin Chang <jychang48 at gmail.com> wrote:

> 600 unknowns is way too small to parallelize. You need at least 10,000
> unknowns per MPI process:
> https://www.mcs.anl.gov/petsc/documentation/faq.html#slowerparallel
>
> What problem are you solving? It sounds like you either compiled PETSc
> with debugging enabled or you have a really terrible solver. Show us the
> output of -log_view.
>
> On Fri, Oct 20, 2017 at 12:47 AM Luca Verzeroli <
> l.verzeroli at studenti.unibg.it> wrote:
>
>> Good morning,
>> For my thesis I'm working with GALILEO, one of the clusters owned by
>> Cineca. http://www.hpc.cineca.it/hardware/galileo
>> My first question is: what is the best configuration for running PETSc on
>> this kind of cluster? My code is a pure MPI program, and I would like to
>> know whether it is better to request more nodes or more cores per node
>> with mpirun.
>> This question comes from the speedup of my code on that cluster. My
>> problem is small: the global matrices are 600x600. Are they too small to
>> see a speedup with more MPI processes? I notice that a single-core run
>> and a multi-core run take about the same time (the multi-core run is
>> about a second slower). The real issue is that I have to run many
>> simulations of the same code while changing some parameters, so I would
>> like to speed up each single simulation.
>> Any advice?
>>
>>
>> Luca Verzeroli
>>
>
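
For reference, a minimal sketch of where -log_view takes effect in a PETSc
main program; the file name and the elided solve are placeholders, not the
poster's actual code:

    /* ex_small.c -- hypothetical skeleton for illustrating -log_view */
    #include <petsc.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;

      /* Command-line options such as -log_view are read into the PETSc
         options database during initialization. */
      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

      /* ... assemble and solve the 600x600 system ... */

      /* If -log_view was given on the command line, PetscFinalize prints
         the performance summary (time, flops, and messages per event). */
      ierr = PetscFinalize();
      return ierr;
    }

Run it, for example, as "mpirun -n 1 ./ex_small -log_view" and again with
more processes; comparing the two summaries shows where the extra time goes
(the executable name is made up; the option is the one Justin asks for).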


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


More information about the petsc-users mailing list