[petsc-dev] GAMG MatSetValues error

Mark F. Adams mark.adams at columbia.edu
Tue Sep 25 09:09:28 CDT 2012


Yes, this is not great.

I could really use a MatGetRowA and MatGetRowB here to get a more accurate number for B.  The problem is that it dies if this is too small, and it's not clear that there is a noticeable performance penalty with the current overestimate (small beer).

I could save a bit by replacing

o_nnz[jj] = ncols;

with 

o_nnz[jj] = 3*ncols/4;

but this might even fail under some circumstances and would not save much.

So if you think this is a problem, we could do something.  This code already has MatAIJ-specific code in it, so I could just drill into it, get the row sizes of A and B, and do this efficiently if need be.

Mark

On Sep 25, 2012, at 8:49 AM, John Mousel <john.mousel at gmail.com> wrote:

> Matt,
> 
> It's not technically an error. It's an inefficient preallocation. I'm assuming Mark doesn't want this to be happening.
> 
> 
> [0] PCSetUp(): Setting up new PC
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777
> [2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777
> [2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [1] MatAssemblyBegin_MPIAIJ(): Stash has 1830 entries, uses 0 mallocs.
> [3] MatAssemblyBegin_MPIAIJ(): Stash has 914 entries, uses 0 mallocs.
> [2] MatAssemblyBegin_MPIAIJ(): Stash has 1991 entries, uses 0 mallocs.
> [0] MatStashScatterBegin_Private(): No of messages: 3 
> [0] MatStashScatterBegin_Private(): Mesg_to: 1: size: 16200 
> [0] MatStashScatterBegin_Private(): Mesg_to: 2: size: 4192 
> [0] MatStashScatterBegin_Private(): Mesg_to: 3: size: 1800 
> [0] MatAssemblyBegin_MPIAIJ(): Stash has 2771 entries, uses 0 mallocs.
> [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 10840 X 10840; storage space: 0 unneeded,119860 used
> [2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 26
> [2] Mat_CheckInode(): Found 10840 nodes out of 10840 rows. Not using Inode routines
> [3] MatAssemblyEnd_SeqAIJ(): Matrix size: 10862 X 10862; storage space: 0 unneeded,117330 used
> [3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 26
> [3] Mat_CheckInode(): Found 10862 nodes out of 10862 rows. Not using Inode routines
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 12138 X 12138; storage space: 0 unneeded,116975 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 26
> [1] Mat_CheckInode(): Found 12138 nodes out of 12138 rows. Not using Inode routines
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10765 X 10765; storage space: 0 unneeded,110555 used
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 25
> [0] Mat_CheckInode(): Found 10765 nodes out of 10765 rows. Not using Inode routines
> [3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777
> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777
> [2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780
> [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
> [0] VecScatterCreate(): General case: MPI to Seq
> [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 10840 X 566; storage space: 118775 unneeded,1100 used
> [2] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 1
> [2] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 10
> [3] MatAssemblyEnd_SeqAIJ(): Matrix size: 10862 X 259; storage space: 116725 unneeded,650 used
> [3] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 3
> [3] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 11
> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10765 X 1295; storage space: 107825 unneeded,3165 used
> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 29
> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 13
> [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 12138 X 1131; storage space: 114609 unneeded,2591 used
> [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 15
> [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 13
> 
> 
> John
> 
> 
> On Tue, Sep 25, 2012 at 5:41 AM, Matthew Knepley <knepley at gmail.com> wrote:
> On Mon, Sep 24, 2012 at 9:48 PM, John Mousel <john.mousel at gmail.com> wrote:
> I'm having a problem with GAMG causing a "Number of mallocs during MatSetValues() is [some non-zero number]" message after
> 
> I do not see any errors in your output.
> 
>   Matt
>  
> [0] PCSetUp(): Setting up new PC
> 
>  I don't get this with either ILU or HYPRE/BoomerAMG. I've attached the output of -info. I see there is a recent change to some pre-allocation for GAMG here:
> 
> http://petsc.cs.iit.edu/petsc/petsc-dev/rev/fe806009181b
> 
> I'm not sure if that has anything to do with it, but my blocksize is 1. I'm running with the following options.
> 
> -ksp_type bcgsl -pc_type gamg -pc_gamg_threshold 0.01 -mg_levels_ksp_type richardson -mg_levels_pc_type sor -mg_coarse_ksp_type richardson -mg_coarse_pc_type sor -mg_coarse_pc_sor_its 4 -pc_gamg_type agg 
> -pc_gamg_agg_nsmooths 1 -pc_gamg_sym_graph true -info -ksp_monitor_true_residual
> 
> Thanks,
> John
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
