[petsc-users] Parallelization of the code

Maahi Talukder maahi.buet at gmail.com
Tue Feb 12 17:42:22 CST 2019


Dear All,


I am trying to solve a linear system using the KSP solvers. I have managed to
solve the system with a sequential code. The part of my sequential code
that creates the matrix and sets its values is the following -

call MatCreate(PETSC_COMM_WORLD,Mp,ierr)
call MatSetSizes(Mp,PETSC_DECIDE,PETSC_DECIDE,u*v,u*v,ierr)
call MatSetFromOptions(Mp,ierr)
call MatSetUp(Mp,ierr)

Do p = 1,29008
   Do r = 1,29008
      if(Q(p,r)/=0.0) then
         val(1) = Q(p,r)
         col(1) = r-1
         call MatSetValues(Mp,ione,p-1,ione,col,val,INSERT_VALUES,ierr)
      endif
   end Do
end Do

call MatAssemblyBegin(Mp,MAT_FINAL_ASSEMBLY,ierr)
call MatAssemblyEnd(Mp,MAT_FINAL_ASSEMBLY,ierr)

And the part of my sequential code that creates the vector is -

call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,u*v,Bx,ierr)
call VecSetFromOptions(Bx,ierr)
call VecDuplicate(Bx,Xp,ierr)
call VecSet(Bx,zero,ierr)

Do p = 1,29008
   if(Fx(p,1)/=0.0) then
      val(1) = Fx(p,1)
      call VecSetValues(Bx,ione,p-1,val,INSERT_VALUES,ierr)
   endif
end Do

call VecAssemblyBegin(Bx,ierr)
call VecAssemblyEnd(Bx,ierr)

So when I run the code on a single processor, it runs fine, but when I tried
to run it on more than one processor, it failed. What I understood from
going through the manual is that if I use MatCreate to create my matrix,
then depending on the number of processors that I put after 'mpiexec -n ...',
it acts either as a sequential code or a parallel code, and I don't need to
do anything extra to make it work in parallel, as PETSc handles that
internally.

So would you please let me know what modifications I need to make to my
existing sequential code for it to work in parallel, such as using
MatGetOwnershipRange?
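
For example, after reading the manual page for MatGetOwnershipRange, my guess
is that the insertion loops would change to something like the sketch below.
This is only a sketch of what I have in mind, not tested in parallel yet, and
it assumes the full Q and Fx arrays are available on every process; rstart,
rend and row would be extra PetscInt variables that I declare.

! each process asks which global rows of Mp it owns
! (rows rstart .. rend-1, in 0-based global numbering)
call MatGetOwnershipRange(Mp,rstart,rend,ierr)

Do p = rstart+1,rend              ! p is the 1-based index into Q
   Do r = 1,29008
      if(Q(p,r)/=0.0) then
         val(1) = Q(p,r)
         col(1) = r-1
         row(1) = p-1
         call MatSetValues(Mp,ione,row,ione,col,val,INSERT_VALUES,ierr)
      endif
   end Do
end Do

call MatAssemblyBegin(Mp,MAT_FINAL_ASSEMBLY,ierr)
call MatAssemblyEnd(Mp,MAT_FINAL_ASSEMBLY,ierr)

! and the same idea for the right-hand-side vector
call VecGetOwnershipRange(Bx,rstart,rend,ierr)

Do p = rstart+1,rend
   if(Fx(p,1)/=0.0) then
      val(1) = Fx(p,1)
      row(1) = p-1
      call VecSetValues(Bx,ione,row,val,INSERT_VALUES,ierr)
   endif
end Do

call VecAssemblyBegin(Bx,ierr)
call VecAssemblyEnd(Bx,ierr)

Is this the right approach, or is there anything else I would need to change?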

Regards,
Maahi Talukder
MSc Student
Clarkson University