[petsc-users] testing scalability for ksp/ex22.c

Francis Poulin fpoulin at uwaterloo.ca
Thu Feb 23 18:58:39 CST 2012


Hello Barry,

I can do it for each of them if that helps, but I suspect the method is the same, so I'm sending the information for the first three runs, n = 2, 4, 8.  In the meantime I will figure out how to change the number of levels.
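
From the manual pages it looks like the number of levels is set with a runtime option rather than a code change.  Since ex22 in v3.1 is built on the DMMG interface, I am guessing something along these lines would give 5 levels plus the solver description you asked for (untested, so the option name is my guess):

./ex22 -da_grid_x 64 -da_grid_y 64 -da_grid_z 32 -dmmg_nlevels 5 -ksp_view -log_summary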

Thanks,
Francis

Attachments:
saw_log_summary_n8_info.txt  <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20120223/2dd782fe/attachment-0003.txt>
saw_log_summary_n4_info.txt  <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20120223/2dd782fe/attachment-0004.txt>
saw_log_summary_n2_info.txt  <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20120223/2dd782fe/attachment-0005.txt>

On 2012-02-23, at 4:20 PM, Barry Smith wrote:

> 
>  Still need the -ksp_view output. It is spending most of the time in the LU factorization and solve. I suspect the coarse problem is way too big (as if you are using only two levels of multigrid), and since it is solved redundantly that takes all the time. Run with, say, 5 levels.
> 
>   Barry
> 
> On Feb 23, 2012, at 3:03 PM, Francis Poulin wrote:
> 
>> Hello again,
>> 
>> I am using v3.1 of PETSc.
>> 
>> I changed the grid sizes slightly and I'm including 4 log_summary files.
>> 
>> The times are shown below.  I have not modified the example at all except to specify the problem size.  Could it be that I need a much larger problem?  When I tried much larger grids I think I got an error because I was using too much memory.
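>>
>> (For example, doubling each dimension from 64x64x32 to 128x128x64 would multiply the number of unknowns, and hence roughly the memory, by a factor of 8; those larger sizes are only an illustration, not runs I actually tried.)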
>> 
>> n 		time
>> 2		22s
>> 4		29.8s
>> 8		33.7s
>> 16		28.3s
>> 
>> Sorry for my first email but I hope this has more information.
>> 
>> Cheers, Francis
>> 
>> 
>> <saw_log_summary_n2.txt>
>> <saw_log_summary_n4.txt>
>> <saw_log_summary_n8.txt>
>> <saw_log_summary_n16.txt>
>> 
>> On 2012-02-23, at 3:27 PM, Jed Brown wrote:
>> 
>>> Always send output with -log_summary for each run that you do.
>>> On Thu, Feb 23, 2012 at 14:16, Francis Poulin <fpoulin at uwaterloo.ca> wrote:
>>> Hello,
>>> 
>>> I am learning to use PETSc but am just a novice.  I have a rather basic question to ask and could not find it in the archives.
>>> 
>>> I want to test the scalability of a multigrid solver for the 3D Poisson equation.  I found ksp/ex22.c, which seems to solve the problem I'm interested in.  I ran it on a large server using different numbers of processors.
>>> 
>>> The command that I used to run it under MPI was
>>> 
>>> ./ex22 -da_grid_x 64 -da_grid_y 64 -da_grid_z 32
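>>>
>>> For each process count this was launched through the MPI wrapper; the exact launcher name depends on the installation, but roughly:
>>>
>>> mpiexec -n 8 ./ex22 -da_grid_x 64 -da_grid_y 64 -da_grid_z 32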
>>> 
>>> Which version of PETSc?
>>> 
>>> 
>>> I tested it using 2, 4, 8, and 16 CPUs and found that the time increases.  See below.  Clearly there is something that I don't understand, since the time should decrease.
>>> 
>>> n               wtime
>>> ---------------------
>>> 2               3m58s
>>> 4               3m54s
>>> 8               5m51s
>>> 16              7m23s
>>> 
>>> Any advice would be greatly appreciated.
>>> 
>>> Best regards,
>>> Francis
>>> 
>>> 
>>> 
>> 
> 


