[petsc-users] How to parallelize a linear system solver A*x=q in a scalar code

Valerio Grazioso graziosov at libero.it
Tue Oct 14 06:40:54 CDT 2014


Dear Jed,
thanks for the reply!

>> Dear All,
>> I have to parallelize an old Fortran 77/Fortran 90 CFD scalar
> 
> By "scalar", do you mean "serial”?

Yes, by scalar I mean serial.

>> code. Before starting from scratch, I decided to parallelize "just"
>> the pressure solver (that is, a linear system solve A*x=q).
>> Because of the subroutine call tree, and in particular the fact that a
>> lot of subroutines do some output, it would be of great help if
>> I could let just one process run through the main program, leaving all
>> the others on hold (with an MPI_Barrier), up to the point where I create
>> an MPI matrix (and MPI vectors as well) with MatCreate, and then
>> solve the "parallel" system. Is this plan feasible with PETSc?
> 
> You can create parallel objects, but call MatSetValues and VecSetValues
> only from rank 0 (where the rest of the code runs).  When done, call
> MatAssemblyBegin/MatAssemblyEnd and VecAssemblyBegin/VecAssemblyEnd
> collectively.  PETSc will communicate the data where it needs to go.
> 

Thanks for the advice. I’ll give it a try! 
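
For the record, this is how I understand the recipe, as a minimal C sketch
(the 2*I matrix, the global size n = 100, and the right-hand side of ones
are placeholders for illustration, not the real CFD problem):

  #include <petsc.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    Vec            x, q;
    PetscMPIInt    rank;
    PetscInt       i, n = 100;        /* placeholder global size */
    PetscScalar    two = 2.0, one = 1.0;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

    /* Creation is collective: every rank participates */
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);
    ierr = VecCreate(PETSC_COMM_WORLD, &q);CHKERRQ(ierr);
    ierr = VecSetSizes(q, PETSC_DECIDE, n);CHKERRQ(ierr);
    ierr = VecSetFromOptions(q);CHKERRQ(ierr);
    ierr = VecDuplicate(q, &x);CHKERRQ(ierr);

    /* Only rank 0, where the serial code runs, inserts entries */
    if (!rank) {
      for (i = 0; i < n; i++) {
        ierr = MatSetValues(A, 1, &i, 1, &i, &two, INSERT_VALUES);CHKERRQ(ierr);
        ierr = VecSetValues(q, 1, &i, &one, INSERT_VALUES);CHKERRQ(ierr);
      }
    }

    /* Assembly is collective: PETSc ships each entry to its owner rank */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = VecAssemblyBegin(q);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(q);CHKERRQ(ierr);

    /* ... solve here (a KSP sketch follows later in this message) ... */
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&q);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }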

> Be warned that this is totally non-scalable and cannot provide more than
> a small speedup.
> 
> https://en.wikipedia.org/wiki/Amdahl%27s_law

I totally agree. But my idea was to “use” this as an exercise: I have to get to know both the serial code
and PETSc!
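
(For concreteness: with a parallelizable fraction p of the runtime, Amdahl's law
gives speedup = 1 / ((1-p) + p/N) <= 1/(1-p); so if the pressure solver were, say,
50% of the runtime, the overall speedup could never exceed 2, no matter how many
processes N are used.)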

> Indeed, depending on the problem configuration and the algorithms used
> by the current implementation, you might get more speedup by
> implementing a quality geometric multigrid solver in the serial code
> than by "parallelizing" only this bit.
> 
> If the present code uses good algorithms, I would encourage doing the
> hard work of making it fully parallel as soon as possible.


Yes, somehow I’m betting against the code: as a linear solver it uses a SOR iterative method (and I don’t think the relaxation factor is well chosen).
Just by using PETSc’s KSP methods (together with its PC preconditioners) I hope to find faster solvers, even in the serial case.
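
Concretely, I was thinking of something like this (a sketch against the
current PETSc API, assuming A, q and x have already been created and
assembled as in the earlier snippet):

  /* Solve A*x = q with a runtime-selectable Krylov method */
  KSP ksp;
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr); /* operator and preconditioning matrix */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);     /* honors -ksp_type, -pc_type, ... */
  ierr = KSPSolve(ksp, q, x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

That way different solvers can be compared at run time without recompiling,
e.g. -ksp_type richardson -pc_type sor (which reproduces the plain SOR
iteration), or -ksp_type gmres -pc_type bjacobi, adding -ksp_monitor to
watch the convergence.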

Regards
Valerio