[petsc-users] DMSetLabelValue takes a lot of time for large domain
Danyang Su
danyang.su at gmail.com
Wed Apr 25 11:40:28 CDT 2018
Hi Matthew,
In the worst case, every node/cell may have a different label value.
Below is one of the worst scenarios, with 102299 nodes and 102299
distinct label values, used as a test. I found that the time cost
increases during the loop: the first 9300 iterations take the least
time (<0.5 s) while the last 9300 iterations take much more (>7.7 s),
as shown below. With a larger mesh of more than 1 million nodes, this
part runs very slowly. PETSc is configured with optimization on.
Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
--download-mpich --download-scalapack --download-parmetis
--download-metis --download-ptscotch --download-fblaslapack
--download-hypre --download-superlu_dist --download-hdf5=yes
--download-ctetgen --with-debugging=0 COPTFLAGS="-O3 -march=native
-mtune=native" CXXOPTFLAGS="-O3 -march=native -mtune=native"
FOPTFLAGS="-O3 -march=native -mtune=native"
 istart     iend   progress      CPU_Time           time cost - old (sec)   time cost - new (sec)
      0     9299   0             1524670045.51166
   9300    18599   0.100010753   1524670045.99605   0.4843890667             0.497246027
  18600    27899   0.200010747   1524670047.32635   1.330302                 1.3820912838
  27900    37199   0.300010741   1524670049.3066    1.9802515507             2.2439446449
  37200    46499   0.400010765   1524670052.1594    2.852804184              3.0739262104
  46500    55799   0.500010729   1524670055.90961   3.7502081394             3.9270553589
  55800    65099   0.600010753   1524670060.47654   4.5669286251             4.7571902275
  65100    74399   0.700010777   1524670066.0941    5.6175630093             5.7428796291
  74400    83699   0.800010741   1524670072.53886   6.44475317               6.5761549473
  83700    92998   0.900010765   1524670079.99072   7.4518604279             7.4606924057
  92999   102298   1             1524670087.71066   7.7199423313             8.2424075603
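A rough extrapolation from the table above (my own estimate, assuming
the per-point cost keeps growing linearly with the number of values
already set, so the total time scales roughly as N^2):

   T(N) ~ c * N^2
   T(1.0e5) ~ 42 s   =>   c ~ 4.0e-9 s
   T(1.0e6) ~ 4.0e-9 * (1.0e6)^2 ~ 4000 s   (about an hour)

which would be consistent with how slow the >1 million node case is.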
Old code:

   do ipoint = 0, istart-1
      !c output time cost, use 1 processor to test
      if (b_enable_output .and. rank == 0) then
         if (mod(ipoint,iprogress) == 0 .or. ipoint == istart-1) then
            !write(*,'(f3.1,1x)',advance="no") (ipoint+1.0)/istart
            write(*,*) ipoint, (ipoint+1.0)/istart, "time", MPI_Wtime()
         end if
      end if

      call DMSetLabelValue(dmda_flow%da, "cid_lg2g", ipoint,        &
                           ipoint+1, ierr)
      CHKERRQ(ierr)
   end do
New code:

   call DMCreateLabel(dmda_flow%da, 'cid_lg2g', ierr)
   CHKERRQ(ierr)
   call DMGetLabel(dmda_flow%da, 'cid_lg2g', label, ierr)
   CHKERRQ(ierr)

   do ipoint = 0, istart-1
      !c output time cost, use 1 processor to test
      if (b_enable_output .and. rank == 0) then
         if (mod(ipoint,iprogress) == 0 .or. ipoint == istart-1) then
            !write(*,'(f3.1,1x)',advance="no") (ipoint+1.0)/istart
            write(*,*) ipoint, (ipoint+1.0)/istart, "time", MPI_Wtime()
         end if
      end if

      call DMLabelSetValue(label, ipoint, ipoint+1, ierr)
      CHKERRQ(ierr)
   end do
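The only new declaration compared with the old code is the label
handle (a minimal sketch; the existing PETSc includes/modules and the
dmda_flow type are unchanged):

   DMLabel :: label          ! handle returned by DMGetLabel
   PetscErrorCode :: ierr    ! already declared in the existing code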
Thanks,
Danyang
On 2018-04-25 03:16 AM, Matthew Knepley wrote:
> On Tue, Apr 24, 2018 at 11:57 PM, Danyang Su <danyang.su at gmail.com> wrote:
>
> Hi All,
>
> I use DMPlex in unstructured grid code and recently found
> DMSetLabelValue takes a lot of time for large problem, e.g., num.
> of cells > 1 million. In my code, I use
>
>
> I read your code wrong. For large loop, you should not use the
> convenience function. You should use
>
> DMPlexCreateFromCellList ()
>
>
> DMGetLabel(dm, name, &label)
>
>
> Loop over all cells/nodes{
>
> DMSetLabelValue
>
>
> Replace this by DMLabelSetValue(label, point, val)
>
> }
>
> DMPlexDistribute
>
> The code works fine except DMSetLabelValue takes a lot of time for
> large problem. I use DMSetLabelValue to set material id for all
> the nodes or cells so that each subdomain has a copy of material
> id. Is there any other functions that can be used more efficient,
> e.g. set labels by array, not 1 by 1?
>
>
> That should take much less time.
>
> Thanks,
>
> Matt
>
> Thanks,
>
> Danyang
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
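For reference, my understanding of the sequence suggested above, as a
rough Fortran sketch (ndim, numCells, numVertices, numCorners, cells,
coords, dm, dmDist and migrationSF are placeholder names from my own
setup, not exact code):

   ! build the serial DMPlex from the cell list, as in my code
   call DMPlexCreateFromCellList(PETSC_COMM_WORLD, ndim, numCells,    &
                                 numVertices, numCorners, PETSC_TRUE, &
                                 cells, ndim, coords, dm, ierr)
   CHKERRQ(ierr)

   ! create the label once, get its handle, and set values directly
   ! on the DMLabel instead of going through DMSetLabelValue
   call DMCreateLabel(dm, 'cid_lg2g', ierr)
   CHKERRQ(ierr)
   call DMGetLabel(dm, 'cid_lg2g', label, ierr)
   CHKERRQ(ierr)
   do ipoint = 0, istart-1
      call DMLabelSetValue(label, ipoint, ipoint+1, ierr)
      CHKERRQ(ierr)
   end do

   ! distribute the mesh with no overlap; the label travels with the DM
   call DMPlexDistribute(dm, 0, migrationSF, dmDist, ierr)
   CHKERRQ(ierr)
   ! (on more than one process dmDist replaces dm; migrationSF can be
   !  freed with PetscSFDestroy if it is not needed)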