[petsc-dev] boomerAmg scalability

Mark F. Adams mark.adams at columbia.edu
Thu Jan 12 22:03:26 CST 2012


Ravi, can you run with -ksp_view_binary? This will produce two files.
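
For reference, a minimal sketch of the equivalent by-hand dump to PETSc binary format (roughly what -ksp_view_binary automates at solve time). The file name "system.bin" and the assembled A and b are placeholders here, not anything from Ravi's code:

/* Minimal sketch (not PETSc source) of dumping a linear system to PETSc
   binary format by hand, roughly what -ksp_view_binary automates.
   "system.bin" and the assembled A and b are placeholders. */
#include <petscmat.h>

PetscErrorCode DumpSystem(Mat A, Vec b)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "system.bin",
                               FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = MatView(A, viewer);CHKERRQ(ierr);   /* operator */
  ierr = VecView(b, viewer);CHKERRQ(ierr);   /* right-hand side */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}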

Hong, ex10 will read in these files and solve them.  I will probably not be able to get to this until Monday.
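
A hedged sketch of that ex10-style workflow (not ex10 itself): load the dumped matrix and right-hand side from the binary file and solve, picking the preconditioner at run time (e.g. -pc_type gamg). The file name is a placeholder, and the four-argument KSPSetOperators follows the petsc-dev API of the day:

/* Sketch of the ex10-style workflow: load a matrix and right-hand side from
   a PETSc binary file and solve, so the preconditioner can be chosen on the
   command line (e.g. -pc_type gamg).  "system.bin" is a placeholder. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            b, x;
  KSP            ksp;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "system.bin",
                               FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatLoad(A, viewer);CHKERRQ(ierr);            /* operator */
  ierr = VecCreate(PETSC_COMM_WORLD, &b);CHKERRQ(ierr);
  ierr = VecLoad(b, viewer);CHKERRQ(ierr);            /* right-hand side */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);        /* picks up -pc_type etc. */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}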

Also, this matrix has just two equations on proc 0 and about 11000 on proc 1, so it is strangely balanced, in case that helps ...

Mark

On Jan 12, 2012, at 10:35 PM, Hong Zhang wrote:

> Ravi,
> 
> I need more info for debugging. Can you provide a simple stand-alone code and matrices in PETSc
> binary format that reproduce the error?
> 
> MatTransposeMatMult() for mpiaij is a newly developed subroutine - less than one month old 
> and not well tested yet :-(
> I used petsc-dev/src/mat/examples/tests/ex94.c for testing.
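
For context, the routine in question computes C = A^T * B. A minimal usage sketch follows; A and B stand for already-assembled parallel matrices and are assumptions here, not code from the thread:

/* Minimal usage sketch of the routine under discussion: C = A^T * B.
   A and B are assumed to be assembled MPIAIJ matrices; PETSC_DEFAULT lets
   PETSc estimate the fill of the product. */
#include <petscmat.h>

PetscErrorCode TransposeProduct(Mat A, Mat B, Mat *C)
{
  PetscErrorCode ierr;

  ierr = MatTransposeMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
  return 0;
}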
> 
> Thanks,
> 
> Hong
> 
> On Thu, Jan 12, 2012 at 9:17 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
> It looks like the problem is in MatTransposeMatMult and Hong (cc'ed) is working on it.
> 
> I'm hoping that your output will be enough for Hong to figure this out, but I could not reproduce this problem with any of my tests.
> 
> If Hong cannot figure this out, then we will need to get the matrix from you to reproduce it.
> 
> Mark
> 
> 
> On Jan 12, 2012, at 6:25 PM, Ravi Kannan wrote:
> 
>> Hi Mark,
>>  
>> Any luck with the gamg bug fix?
>>  
>> Thanks,
>> Ravi.
>>  
>> From: petsc-dev-bounces at mcs.anl.gov [mailto:petsc-dev-bounces at mcs.anl.gov] On Behalf Of Mark F. Adams
>> Sent: Wednesday, January 11, 2012 1:54 PM
>> To: For users of the development version of PETSc
>> Subject: Re: [petsc-dev] boomerAmg scalability
>>  
>> This seems to be dying earlier than it was last week, so it looks like a new bug in MatTransposeMatMult.
>>  
>> Mark
>>  
>> On Jan 11, 2012, at 1:59 PM, Matthew Knepley wrote:
>> 
>> 
>> On Wed, Jan 11, 2012 at 12:23 PM, Ravi Kannan <rxk at cfdrc.com> wrote:
>> Hi Mark,
>>  
>> I downloaded the dev version again. This time, the program crashes even earlier. Attached are the serial and parallel info outputs.
>>  
>> Could you kindly take a look?
>>  
>> It looks like this is a problem with MatMatMult(). Can you try to reproduce this using KSP ex10? You put
>> your matrix in binary format and use -pc_type gamg. Then you can send us the matrix and we can track
>> it down. Or are you running an example there?
>>  
>>   Thanks,
>>  
>>     Matt
>> 
>> Thanks,
>> Ravi.
>>  
>> From: petsc-dev-bounces at mcs.anl.gov [mailto:petsc-dev-bounces at mcs.anl.gov] On Behalf Of Mark F. Adams
>> Sent: Monday, January 09, 2012 3:08 PM
>> 
>> To: For users of the development version of PETSc
>> Subject: Re: [petsc-dev] boomerAmg scalability
>>  
>>  
>> Yes, it's all checked in; just pull from dev.
>> Mark
>>  
>> On Jan 9, 2012, at 2:54 PM, Ravi Kannan wrote:
>>  
>> 
>> Hi Mark,
>>  
>> Thanks for your efforts.
>>  
>> Do I need to do the install from scratch once again, or just check out some particular files (gamg.c, for instance)?
>>  
>> Thanks,
>> Ravi.
>>  
>> From: petsc-dev-bounces at mcs.anl.gov [mailto:petsc-dev-bounces at mcs.anl.gov] On Behalf Of Mark F. Adams
>> Sent: Friday, January 06, 2012 10:30 AM
>> To: For users of the development version of PETSc
>> Subject: Re: [petsc-dev] boomerAmg scalability
>>  
>> I think I found the problem.  You will need to use petsc-dev to get the fix.
>>  
>> Mark
>>  
>> On Jan 6, 2012, at 8:55 AM, Mark F. Adams wrote:
>> 
>> Ravi, I forgot, but you can just use -ksp_view_binary to output the matrix data (two files). You could run it with two procs and a Jacobi solver to get past the solve, which is where it writes the matrix (I believe).
>> Mark
>>  
>> On Jan 5, 2012, at 6:19 PM, Ravi Kannan wrote:
>> 
>> Just sent in another email with the attachment.
>>  
>> From: petsc-dev-bounces at mcs.anl.gov [mailto:petsc-dev-bounces at mcs.anl.gov] On Behalf Of Jed Brown
>> Sent: Thursday, January 05, 2012 5:15 PM
>> To: For users of the development version of PETSc
>> Subject: Re: [petsc-dev] boomerAmg scalability
>>  
>> On Thu, Jan 5, 2012 at 17:12, Ravi Kannan <rxk at cfdrc.com> wrote:
>> I have attached the verbose+info outputs for both the serial and the parallel (2-partition) runs. NOTE: at some point the serial output says PC=Jacobi! Is it implicitly converting the PC to Jacobi?
>>  
>> Looks like you forgot the attachment.
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>>  
> 
> 
