[petsc-users] PETSC matrix assembling super slow
Song Gao
song.gao2 at mail.mcgill.ca
Tue Jan 29 12:22:43 CST 2019
I think you would prefer to preallocate the matrix. Uncomment this line:

! call MatMPIAIJSetPreallocation(Amat,f9,PETSC_NULL_INTEGER,f6,PETSC_NULL_INTEGER,ierr)
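
A minimal sketch of what that could look like, assuming f9 and f6 are PetscInt variables holding upper bounds on the number of nonzeros per row in the diagonal and off-diagonal blocks (the names suggest 9 and 6). Note also that the code below creates the matrix as MATMPIBAIJ, while MatMPIAIJSetPreallocation only takes effect on MPIAIJ matrices, so you would either switch the type to MATMPIAIJ or use the BAIJ variant:

    PetscInt :: f9, f6
    f9 = 9   ! assumed bound on nonzeros per row, diagonal block
    f6 = 6   ! assumed bound on nonzeros per row, off-diagonal block
    ! if Amat is MATMPIAIJ:
    call MatMPIAIJSetPreallocation(Amat, f9, PETSC_NULL_INTEGER, &
                                   f6, PETSC_NULL_INTEGER, ierr)
    ! if Amat stays MATMPIBAIJ (block size 1 shown here):
    call MatMPIBAIJSetPreallocation(Amat, 1, f9, PETSC_NULL_INTEGER, &
                                    f6, PETSC_NULL_INTEGER, ierr)

Without preallocation, every MatSetValues call that overflows a row's current capacity triggers a malloc and a copy of the matrix data, which is why assembly slows down so dramatically as the problem grows. Running with -info prints the number of mallocs used during MatSetValues; after correct preallocation it should be zero.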
On Tue, Jan 29, 2019 at 12:40 PM Yaxiong Chen via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Hello,
>
>
> I have a 2D system which is assembled element by element. Ae is my
> elemental matrix, auxRHSe(:) and RHSe(:) are the corresponding right-hand
> sides, and idx holds the global indices. My code is as follows; however,
> the assembly is very slow (marked in the comments in the code). I am not
> sure whether my way of assembling is right, since for each element the
> entries of idx are not contiguous. Do you have any idea of a better way
> to assemble the matrix?
>
>
> Thanks
>
>
> block
> PetscErrorCode ierr
> PetscMPIInt rank,nproc, mystart
> PetscInt nelem
> integer,allocatable ::info(:)
> real(wp), allocatable :: Ae(:,:), auxRHSe(:),RHSe(:)
> integer, allocatable :: idx(:)
> PetscScalar, pointer :: xx_v(:)
>
> PC prec
> PetscScalar :: val
> Vec xvec,bvec,uvec
> Mat Amat
> KSP ksp
> PetscViewer viewer
> PetscInt geq,j,k,ne,M,Istart,Iend
> PetscBool flg
> KSPConvergedReason reason
> Vec dummyVec, dummyVecs(1)
> MatNullSpace nullspace
>
> call PetscInitialize( PETSC_NULL_CHARACTER, ierr )
>
> if (ierr .ne. 0) then
> print*,'Unable to initialize PETSc'
> stop
> endif
> call MPI_Comm_size(PETSC_COMM_WORLD,nproc,ierr)
> call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
> mystart=rank+1
>
> ! Parameter set
> info = ptSystem%getInitInfo()
> nelem = info(Info_EleMatNum)+info(Info_FixedDOFNum)+info(Info_NumConstrain)
> print*,'nelem',nelem
> !-------------------------------------
> ! Create Matrix
> call MatCreate(PETSC_COMM_WORLD,Amat,ierr)
> call MatSetSizes( Amat, PETSC_DECIDE, PETSC_DECIDE, info(1), info(1), ierr )
> call MatSetType( Amat, MATMPIBAIJ, ierr )
> ! call MatMPIAIJSetPreallocation(Amat,f9,PETSC_NULL_INTEGER,f6,PETSC_NULL_INTEGER,ierr)
> call MatSetFromOptions( Amat, ierr )
> call MatSetUp( Amat, ierr )
> call MatGetOwnershipRange( Amat, Istart, Iend, ierr )
>
> xvec = tVec(0)
> call MatCreateVecs( Amat, PETSC_NULL_VEC, xvec, ierr )
> call VecSetFromOptions( xvec, ierr )
> call VecDuplicate( xvec, bvec, ierr )
> call VecDuplicate( xvec, uvec, ierr )
>
> t1 = MPI_WTime();
>
> do i=mystart,nelem,nproc
> call ptSystem%getElementalMAT(i, Ae, auxRHSe, idx)
> ne=size(idx)
> if (allocated(auxRHSe)) call VecSetValues(bvec,ne,idx-1,auxRHSe,ADD_VALUES,ierr)
> call MatSetValues(Amat,ne,idx-1,ne,idx-1,Ae,ADD_VALUES,ierr)
> end do
>
> nelem = info(Info_EleRHSNum)
> mystart = rank+1
>
> do i = mystart, nelem, nproc
> call ptSystem%getElementalRHS(i, RHSe, idx)
> print*,'idx',idx
> ne=size(idx)
> if (allocated(RHSe)) call VecSetValues(bvec,ne,idx-1,RHSe,ADD_VALUES,ierr)
> end do
> call MatAssemblyBegin(Amat,MAT_FINAL_ASSEMBLY,ierr)
> call MatAssemblyEnd(Amat,MAT_FINAL_ASSEMBLY,ierr)
> ! this part is slow: the loop above finishes, but here it may get stuck
> call VecAssemblyBegin(bvec,ierr)
> ! for a 2500-DOF system, assembly alone takes over 2 seconds
> call VecAssemblyEnd(bvec,ierr)
> ! but for a 10000-DOF system, it gets stuck
> t2 = MPI_WTime();
> print*,'assembling time',t2-t1
> ! Solve
> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> ! Set operators. Here the matrix that defines the linear system
> ! also serves as the preconditioning matrix.
> call KSPSetOperators(ksp,Amat,Amat,ierr)
> call KSPSetFromOptions(ksp,ierr)
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> ! Solve the linear system
> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> call KSPSetType(ksp,KSPCG,ierr)
> call KSPGetPC(ksp,prec,ierr)
> ! call KSPSetPCSide(ksp,PC_SYMMETRIC,ierr)
> call PCSetType(prec,PCJACOBI,ierr)
> call KSPSolve(ksp,bvec,xvec,ierr)
> call PetscFinalize(ierr)
>
> end block
>
>
> Yaxiong Chen,
> Ph.D. Student
>
> School of Mechanical Engineering, 3171
>
> 585 Purdue Mall
>
> West Lafayette, IN 47907
>