[petsc-users] MatCreateBAIJ, SNES, Preallocation...

Smith, Barry F. bsmith at mcs.anl.gov
Fri May 17 02:12:13 CDT 2019



> On May 16, 2019, at 5:28 PM, William Coirier <William.Coirier at kratosdefense.com> wrote:
> 
> Ok, got it. My misinterpretation was how to fill the d_nnz and o_nnz arrays. 
> 
> Thank you for your help!
> 
> Might I make a suggestion related to the documentation? Perhaps I have not fully read the page on MatMPIBAIJSetPreallocation, so you can simply disregard this and I'm ok with that! The documentation says, for d_nnz:
> 
> d_nnz 	- array containing the number of block nonzeros in the various block rows of the in diagonal portion of the local (possibly different for each block row) or NULL. If you plan to factor the matrix you must leave room for the diagonal entry and set it even if it is zero. 
> 
> Am I correct that this array should be of size numRows, where numRows is found from calling MatGetOwnershipRange(J,&iLow,&iHigh), so numRows=iHigh-iLow?
> 
> I think my error was allocating this only to be numRows/bs since I thought it's a block size thing.

   You should not need to set any more than numRows/bs entries, so something still seems a bit odd.

   It is a block size thing. Are you sure the bs in your code matches the block size here?
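
   For example, here is a minimal sketch of sizing and filling the arrays per local block row, as the documentation describes. The function name, the variable names, and the per-row counts (7 diagonal blocks, 2 off-diagonal blocks) are placeholders, not taken from your code:

      #include <petscmat.h>

      /* Sketch: d_nnz/o_nnz have one entry per local BLOCK row, counting nonzero blocks. */
      static PetscErrorCode PreallocateJacobian(Mat J, PetscInt bs)
      {
        PetscInt       iLow, iHigh, numRows, numBlockRows, i;
        PetscInt      *d_nnz, *o_nnz;
        PetscErrorCode ierr;

        ierr = MatGetOwnershipRange(J, &iLow, &iHigh);CHKERRQ(ierr);
        numRows      = iHigh - iLow;   /* local point rows                              */
        numBlockRows = numRows / bs;   /* local block rows = length of d_nnz and o_nnz  */
        ierr = PetscMalloc2(numBlockRows, &d_nnz, numBlockRows, &o_nnz);CHKERRQ(ierr);
        for (i = 0; i < numBlockRows; i++) {
          d_nnz[i] = 7;                /* placeholder: nonzero bs x bs blocks in the diagonal portion     */
          o_nnz[i] = 2;                /* placeholder: nonzero bs x bs blocks in the off-diagonal portion */
        }
        ierr = MatMPIBAIJSetPreallocation(J, bs, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);
        ierr = PetscFree2(d_nnz, o_nnz);CHKERRQ(ierr);
        return 0;
      }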
> 
> When I fixed this, things worked really fast! Maybe it's obvious it should be that size. :-)
> 
> Regardless, thanks for your help. PETSc is great! I'm a believer and user.
> ________________________________________
> From: Smith, Barry F. [bsmith at mcs.anl.gov]
> Sent: Thursday, May 16, 2019 4:07 PM
> To: William Coirier
> Cc: petsc-users at mcs.anl.gov; Michael Robinson; Andrew Holm
> Subject: Re: [petsc-users] MatCreateBAIJ, SNES, Preallocation...
> 
>> On May 16, 2019, at 3:44 PM, William Coirier via petsc-users <petsc-users at mcs.anl.gov> wrote:
>> 
>> Folks:
>> 
>> I'm developing an application using the SNES, and overall it's working great, as do many of our other PETSc-based projects. But I'm having a problem related to (presumably) preallocation, block matrices, and SNES.
>> 
>> Without going into details about the actual problem we are solving, here are the symptoms/characteristics/behavior.
>>      • For the SNES Jacobian, I'm using MatCreateBAIJ with block size bs=3, and letting PETSC_DECIDE determine the partitioning. The actual call is:
>>              • ierr = MatCreateBAIJ(PETSC_COMM_WORLD, bs, PETSC_DECIDE, PETSC_DECIDE, (int)3 * numNodesSAM, (int)3 * numNodesSAM, PETSC_DEFAULT, NULL, PETSC_DEFAULT, NULL, &J);
>>      • When registering the SNES Jacobian function, I set the B and J matrices to be the same (a sketch of the callback signature follows this list).
>>              •     ierr = SNESSetJacobian(snes, J, J, SAMformSNESJ, (void *)this); CHKERRQ(ierr);
>>      • I can either let PETSc figure out the allocation structure:
>>              • ierr = MatMPIBAIJSetPreallocation(J, bs, PETSC_DEFAULT, NULL,PETSC_DEFAULT, NULL);
>>      •  or, do it myself, since I know the fill pattern,
>>              • ierr = MatMPIBAIJSetPreallocation(J, bs, d_nz_dum,&d_nnz[0],o_nz_dum,&o_nnz[0]);
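>> 
>> For context, SAMformSNESJ is registered through SNESSetJacobian() and therefore has the standard SNES Jacobian callback signature. A minimal sketch (the body below is only illustrative, not our actual code; "MyApp" is a hypothetical name for the class whose "this" pointer is passed as the context) looks like:
>> 
>>     /* Sketch of a SNESSetJacobian() callback; entries go into B block by block. */
>>     PetscErrorCode SAMformSNESJ(SNES snes, Vec x, Mat J, Mat B, void *ctx)
>>     {
>>       PetscErrorCode ierr;
>>       /* MyApp *app = (MyApp *)ctx;  hypothetical application object passed as (void *)this */
>> 
>>       /* ... compute the 3x3 blocks from x and the application data, then insert them, e.g.: */
>>       /* ierr = MatSetValuesBlocked(B, 1, &blockRow, 1, &blockCol, vals, INSERT_VALUES);CHKERRQ(ierr); */
>> 
>>       ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>>       ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>>       return 0;
>>     }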
>> The symptoms/problems are as follows:
>>      • Whether I do preallocation or not, the "setup" time is pretty long. It might take 2 minutes before SNES starts doing its thing. After this setup, convergence and speed are great, but this first phase takes a long time. I'm assuming this has to be related to some poor preallocation setup, so it's doing tons of mallocs that aren't needed.
> 
>   You should definitely get much better performance with proper preallocation than with none (unless the default is enough for your matrix). Run with -info and grep for "malloc"; this will tell you exactly how many mallocs, if any, are taking place inside MatSetValues() due to improper preallocation.
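> 
>   For example (assuming the executable is named ./app and is launched with mpiexec on 4 ranks; adjust to your own run line):
> 
>       mpiexec -n 4 ./app -info 2>&1 | grep -i malloc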
> 
>>      • If I don't call my Jacobian formulation before calling SNESSolve, I get a segmentation violation in a PETSc routine.
> 
>  Not sure what you mean by Jacobian formulation, but I'm guessing you mean filling up the Jacobian with numerical values?
> 
>  Something is wrong because you should not need to fill up the values before calling SNESSolve(), and regardless it should never ever crash with
> a segmentation violation. You can run with valgrind (https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind) to make sure that it is not a memory corruption issue. You can also run in the debugger (perhaps with the PETSc command line option -start_in_debugger) to get more details on why it is crashing.
> 
>  When you have it running satisfactorily, you can send us the output from running with -log_view and we can let you know how it seems to be performing efficiency-wise.
> 
>   Barry
> 
> 
> 
>> (If I DO call my Jacobian first, things work great, although slow for the setup phase.) Here's a snippet of the traceback:
>> #0  0x00000000009649fc in MatMultAdd_SeqBAIJ_3 (A=<optimized out>,
>>    xx=0x3a525b0, yy=0x3a531b0, zz=0x3a531b0)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/mat/impls/baij/seq/baij2.c:1424
>> #1  0x00000000006444cb in MatMult_MPIBAIJ (A=0x15da340, xx=0x3a542a0,
>>    yy=0x3a531b0)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/mat/impls/baij/mpi/mpibaij.c:1380
>> #2  0x00000000005b2c0f in MatMult (mat=0x15da340, x=x@entry=0x3a542a0,
>>    y=y@entry=0x3a531b0)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/mat/interface/matrix.c:2396
>> #3  0x0000000000c61f2e in PCApplyBAorAB (pc=0x1ce78c0, side=PC_LEFT,
>>    x=0x3a542a0, y=y@entry=0x3a548a0, work=0x3a531b0)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/ksp/pc/interface/precon.c:690
>> #4  0x0000000000ccb36b in KSP_PCApplyBAorAB (w=<optimized out>, y=0x3a548a0,
>>    x=<optimized out>, ksp=0x1d44d50)
>>    at /home/jstutts/Downloads/petsc-3.11.1/include/petsc/private/kspimpl.h:309
>> #5  KSPGMRESCycle (itcount=itcount@entry=0x7fffffffc02c,
>>    ksp=ksp@entry=0x1d44d50)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmres.c:152
>> #6  0x0000000000ccbf6f in KSPSolve_GMRES (ksp=0x1d44d50)
>>    at /home/jstutts/Downloads/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmres.c:237
>> #7  0x00000000007dc193 in KSPSolve (ksp=0x1d44d50, b=b@entry=0x1d41c70,
>>    x=x@entry=0x1cebf40)
>> 
>> 
>> 
>> I apologize if I’ve missed something in the documentation or examples, but I can’t seem to figure this one out. The “setup” seems to take too long, and from my previous experiences with PETSc, this is due to a poor preallocation strategy.
>> 
>> Any and all help is appreciated!
>> 
>> -----------------------------------------------------------------------
>> William J. Coirier, Ph.D.
>> Director, Aerosciences and Engineering Analysis Branch
>> Advanced Concepts Development and Test Division
>> Kratos Defense and Rocket Support Services
>> 4904 Research Drive
>> Huntsville, AL 35805
>> 256-327-8170
>> 256-327-8120 (fax)
> 


