[petsc-users] How to use PETSc4py/SLEPc4py to solve an eigenvalue problem in parallel
    Jed Brown
    jed at jedbrown.org
    Wed Oct 29 22:52:48 CDT 2014

Mengqi Zhang <jollage at gmail.com> writes:
> It seems that running in parallel doesn't help; the elapsed time is
> roughly the same.
http://www.mcs.anl.gov/petsc/documentation/faq.html#computers
> You see that in the process of assembling the matrix in the parallel
> case, the two cores are working according to
>
>     from  0  to  8192  on rank  0
>     from  8192  to  16384  on rank  1
>
> but in the SLEPc phase, only rank 0 appears to be working (the print
> statement produces output on rank 0, but not on rank 1).
That is just printing.  If every rank printed, you'd have a jumbled mess
of output.
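The per-rank ranges quoted above come from the matrix's row ownership range (in petsc4py, Mat.getOwnershipRange()). As a rough illustration of how PETSc's default contiguous 1-D row layout splits 16384 rows across two ranks, here is a minimal sketch in plain Python (no MPI needed; the helper name is made up for illustration):

```python
def ownership_range(n_global, comm_size, rank):
    """Approximate PETSc's default contiguous row partitioning.

    Each rank owns n_global // comm_size rows, with any remainder
    handed out one extra row to the lowest-numbered ranks (PETSc's
    PETSC_DECIDE layout behaves similarly for a 1-D row distribution).
    """
    base, rem = divmod(n_global, comm_size)
    start = rank * base + min(rank, rem)
    end = start + base + (1 if rank < rem else 0)
    return start, end

# The split reported in the original message: 16384 rows over 2 ranks.
print(ownership_range(16384, 2, 0))  # (0, 8192)
print(ownership_range(16384, 2, 1))  # (8192, 16384)
```

So both ranks do own and assemble their halves of the matrix; the asymmetric output in the solve phase is only the rank-0-only printing described above.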
Always send the output of -log_summary if you have questions about performance.
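For reference, a typical invocation might look like the following (the script name and rank count are placeholders; -log_summary is the profiling option in PETSc releases of this era, later renamed -log_view):

```shell
# Run the example on 2 MPI ranks and have PETSc print its
# performance/profiling summary at exit.
mpiexec -n 2 python ex1.py -log_summary
```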
> I also tried to run ex1.py in parallel and in serial; the computation
> times for a relatively large matrix are roughly the same for the two
> runs, and in some cases the parallel run is even slower.