[petsc-users] DOF in DMDACreate3d and staggered grid

Barry Smith bsmith at mcs.anl.gov
Mon Jun 18 22:36:28 CDT 2012


On Jun 18, 2012, at 10:34 PM, TAY wee-beng wrote:

> 
> On 18/6/2012 10:58 PM, Barry Smith wrote:
>> On Jun 18, 2012, at 9:54 PM, TAY wee-beng wrote:
>> 
>>> On 18/6/2012 10:36 PM, Barry Smith wrote:
>>>>   Do the PETSc examples run on this system? cd src/snes/examples/tutorials; make ex19; then run ex19 on a bunch of nodes of the cluster. Does it run correctly? Repeat with ex5f.
>>>> 
>>>>    Barry
>>> ex19 and ex5f both run w/o problem.
>>> 
>>> [wtay at hpc12:tutorials]$ mpiexec -np 2 ./ex19
>>> lid velocity = 0.0625, prandtl # = 1, grashof # = 1
>>> Number of SNES iterations = 2
>>> [wtay at hpc12:tutorials]$ mpiexec -np 4 ./ex19
>>> lid velocity = 0.0625, prandtl # = 1, grashof # = 1
>>> Number of SNES iterations = 2
>>> 
>>> [wtay at hpc12:tutorials]$ ./ex5f
>>> Number of SNES iterations =     4
>>> [wtay at hpc12:tutorials]$ mpiexec -np 2 ./ex5f
>>> Number of SNES iterations =     4
>>> [wtay at hpc12:tutorials]$ mpiexec -np 4 ./ex5f
>>> Number of SNES iterations =     4
>>> [wtay at hpc12:tutorials]$
>>> 
>>> The problem seems to lie in DMDAVecGetArrayF90 and DMDAVecRestoreArrayF90. I reported this problem before, when I converted ex29 from C to Fortran. It works with VS2008 on Windows but not on Linux. I wonder if there is a bug or a compatibility problem.
>>    Then it is likely an Intel compiler bug. Does this happen with the GNU gfortran compiler combined with gcc as the C compiler?
>> 
>>   Barry
> I have not tested that. I'm also using Intel Fortran with VS2008 on Windows 7 64-bit. Do I have to rebuild PETSc to test with gcc and gfortran?

  Yes. The Intel compilers are slightly different on Linux and Windows and may have a bug on Linux but not on Windows.
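
  For example, one way to set up a separate gcc/gfortran build to test with (an illustrative sketch only; the exact options depend on what is available on your cluster, and --download-mpich is used here because the system OpenMPI was probably built with the Intel compilers):

  ./configure PETSC_ARCH=linux-gnu-debug --with-cc=gcc --with-fc=gfortran --download-mpich --download-fblaslapack --with-debugging=1
  make PETSC_ARCH=linux-gnu-debug all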

  Barry

> 
> Thanks.
>>>> On Jun 18, 2012, at 9:21 PM, TAY wee-beng wrote:
>>>> 
>>>>> On 18/6/2012 9:52 PM, Matthew Knepley wrote:
>>>>>> On Mon, Jun 18, 2012 at 7:30 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>>> 
>>>>>> On 17/6/2012 2:33 PM, Jed Brown wrote:
>>>>>>> On Sun, Jun 17, 2012 at 7:26 AM, TAY wee-beng <zonexo at gmail.com> wrote:
>>>>>>> 
>>>>>>> On 16/6/2012 9:24 AM, Jed Brown wrote:
>>>>>>>> It depends on how you want to solve the problem. I usually group all dofs together. There is a 2D Stokes+Thermodynamics example in SNES ex30 (or 31?).
>>>>>>>> 
>>>>>>> I tried to understand ex30. I have some questions since it's in C and I'm used to Fortran programming.
>>>>>>> 
>>>>>>> This looks about right, see src/dm/examples/tutorials/ex11f90.F.
>>>>>> I tried to build and run ex11f90 on Linux. However, it's not working:
>>>>>> 
>>>>>> I have attached the run and valgrind output:
>>>>>> 
>>>>>> It appears to be a problem with your MPI.
>>>>>> 
>>>>>>    Matt
>>>>> Is there any way to solve this problem? I'm running on a cluster using OpenMPI 1.5.3 and I have no admin rights.
>>>>>> run output:
>>>>>> 
>>>>>> [wtay at hpc12:tutorials]$ ./ex11f90
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>>>>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>>>>>> [0]PETSC ERROR: likely location of problem given in stack below
>>>>>> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>>>>>> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
>>>>>> [0]PETSC ERROR:       is given.
>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>>> [0]PETSC ERROR: Signal received!
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: Petsc Development HG revision: f3b998b41b349e16d47fe42b0e223d3462737e05  HG Date: Fri Jun 15 17:50:32 2012 -0500
>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: ./ex11f90 on a petsc-3.3 named hpc12 by wtay Tue Jun 19 03:22:52 2012
>>>>>> [0]PETSC ERROR: Libraries linked from /home/wtay/Lib/petsc-3.3-dev_shared_debug/lib
>>>>>> [0]PETSC ERROR: Configure run at Sun Jun 17 16:51:29 2012
>>>>>> [0]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-1.5.3/ --with-blas-lapack-dir=/opt/intelcpro-11.1.059/mkl/lib/em64t/ --with-debugging=1 --download-hypre=1 --prefix=/home/wtay/Lib/petsc-3.3-dev_shared_debug --known-mpi-shared=1 --with-shared-libraries
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
>>>>>> --------------------------------------------------------------------------
>>>>>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>>>>>> with errorcode 59.
>>>>>> 
>>>>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>>>>> You may or may not see output from other processes, depending on
>>>>>> exactly when Open MPI kills them.
>>>>>> --------------------------------------------------------------------------
>>>>>> 
>>>>>> valgrind output:
>>>>>> 
>>>>>> ==6027== Memcheck, a memory error detector
>>>>>> ==6027== Copyright (C) 2002-2011, and GNU GPL'd, by Julian Seward et al.
>>>>>> ==6027== Using Valgrind-3.7.0 and LibVEX; rerun with -h for copyright info
>>>>>> ==6027== Command: ex11f90
>>>>>> ==6027==
>>>>>> ==6027== Invalid read of size 8
>>>>>> ==6027==    at 0xA60148D: _wordcopy_fwd_dest_aligned (in /lib64/libc-2.12.so)
>>>>>> ==6027==    by 0xA5FB11D: __GI_memmove (in /lib64/libc-2.12.so)
>>>>>> ==6027==    by 0xA6027DB: argz_insert (in /lib64/libc-2.12.so)
>>>>>> ==6027==    by 0x95CAF25: lt_argz_insert (ltdl.c:1679)
>>>>>> ==6027==    by 0x95CB7D0: foreachfile_callback (ltdl.c:1718)
>>>>>> ==6027==    by 0x95CB4F1: foreach_dirinpath (ltdl.c:710)
>>>>>> ==6027==    by 0x95CB580: lt_dlforeachfile (ltdl.c:1865)
>>>>>> ==6027==    by 0x95DB999: mca_base_component_find (mca_base_component_find.c:301)
>>>>>> ==6027==    by 0x95DC4B0: mca_base_components_open (mca_base_components_open.c:128)
>>>>>> ==6027==    by 0x95F7CE7: opal_paffinity_base_open (paffinity_base_open.c:112)
>>>>>> ==6027==    by 0x95C39FE: opal_init (opal_init.c:307)
>>>>>> ==6027==    by 0x95815A4: orte_init (orte_init.c:78)
>>>>>> ==6027==  Address 0xb39d9a8 is 40 bytes inside a block of size 43 alloc'd
>>>>>> ==6027==    at 0x4C267BA: malloc (vg_replace_malloc.c:263)
>>>>>> ==6027==    by 0x95CA358: lt__malloc (lt__alloc.c:54)
>>>>>> ==6027==    by 0x95CB751: foreachfile_callback (ltdl.c:1764)
>>>>>> ==6027==    by 0x95CB4F1: foreach_dirinpath (ltdl.c:710)
>>>>>> ==6027==    by 0x95CB580: lt_dlforeachfile (ltdl.c:1865)
>>>>>> ==6027==    by 0x95DB999: mca_base_component_find (mca_base_component_find.c:301)
>>>>>> ==6027==    by 0x95DC4B0: mca_base_components_open (mca_base_components_open.c:128)
>>>>>> ==6027==    by 0x95F7CE7: opal_paffinity_base_open (paffinity_base_open.c:112)
>>>>>> ==6027==    by 0x95C39FE: opal_init (opal_init.c:307)
>>>>>> ==6027==    by 0x95815A4: orte_init (orte_init.c:78)
>>>>>> ==6027==    by 0x95432AE: ompi_mpi_init (ompi_mpi_init.c:350)
>>>>>> ==6027==    by 0x955938F: PMPI_Init (pinit.c:84)
>>>>>> ==6027==
>>>>>> ==6027== Syscall param writev(vector[...]) points to uninitialised byte(s)
>>>>>> ==6027==    at 0xA65692B: writev (in /lib64/libc-2.12.so)
>>>>>> ==6027==    by 0xC9A2C56: mca_oob_tcp_msg_send_handler (oob_tcp_msg.c:249)
>>>>>> ==6027==    by 0xC9A417C: mca_oob_tcp_peer_send (oob_tcp_peer.c:204)
>>>>>> ==6027==    by 0xC9A67FC: mca_oob_tcp_send_nb (oob_tcp_send.c:167)
>>>>>> ==6027==    by 0xC3953B5: orte_rml_oob_send (rml_oob_send.c:136)
>>>>>> ==6027==    by 0xC3955FF: orte_rml_oob_send_buffer (rml_oob_send.c:270)
>>>>>> ==6027==    by 0xCDB1E87: modex (grpcomm_bad_module.c:573)
>>>>>> ==6027==    by 0x95436F1: ompi_mpi_init (ompi_mpi_init.c:682)
>>>>>> ==6027==    by 0x955938F: PMPI_Init (pinit.c:84)
>>>>>> ==6027==    by 0x8AB4FF4: MPI_INIT (pinit_f.c:75)
>>>>>> ==6027==    by 0x50ED97E: petscinitialize_ (zstart.c:299)
>>>>>> ==6027==    by 0x40881C: MAIN__ (ex11f90.F:43)
>>>>>> ==6027==  Address 0xed30cd1 is 161 bytes inside a block of size 256 alloc'd
>>>>>> ==6027==    at 0x4C268B2: realloc (vg_replace_malloc.c:632)
>>>>>> ==6027==    by 0x95C4F22: opal_dss_buffer_extend (dss_internal_functions.c:63)
>>>>>> ==6027==    by 0x95C5A64: opal_dss_copy_payload (dss_load_unload.c:164)
>>>>>> ==6027==    by 0x95A1246: orte_grpcomm_base_pack_modex_entries (grpcomm_base_modex.c:861)
>>>>>> ==6027==    by 0xCDB1E3C: modex (grpcomm_bad_module.c:563)
>>>>>> ==6027==    by 0x95436F1: ompi_mpi_init (ompi_mpi_init.c:682)
>>>>>> ==6027==    by 0x955938F: PMPI_Init (pinit.c:84)
>>>>>> ==6027==    by 0x8AB4FF4: MPI_INIT (pinit_f.c:75)
>>>>>> ==6027==    by 0x50ED97E: petscinitialize_ (zstart.c:299)
>>>>>> ==6027==    by 0x40881C: MAIN__ (ex11f90.F:43)
>>>>>> ==6027==    by 0x4087AB: main (in /home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)
>>>>>> ==6027==
>>>>>> ==6027== Invalid read of size 8
>>>>>> ==6027==    at 0x50FC6D8: vecview_ (zvectorf.c:56)
>>>>>> ==6027==    by 0x408A05: MAIN__ (ex11f90.F:56)
>>>>>> ==6027==    by 0xEE1CC2F: ???
>>>>>> ==6027==    by 0xEE5A0AF: ???
>>>>>> ==6027==    by 0x6F5C9F: ??? (in /home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)
>>>>>> ==6027==    by 0x4962FF: ??? (in /home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)
>>>>>> ==6027==    by 0x6F5C9F: ??? (in /home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)
>>>>>> ==6027==    by 0x7FEFFFC13: ???
>>>>>> ==6027==  Address 0xfffffffffffffeb8 is not stack'd, malloc'd or (recently) free'd
>>>>>> ==6027==
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>>> 
>>>>>>> Suppose I have a field with u,v,w,p; in order to use them, I do the following:
>>>>>>> 
>>>>>>> type field
>>>>>>> 
>>>>>>> real u,v,w,p
>>>>>>> 
>>>>>>> end type field
>>>>>>> 
>>>>>>> type(field), pointer :: field1(:,:,:)    ->   declare a pointer array of the derived type
>>>>>>> 
>>>>>>> Also:
>>>>>>> 
>>>>>>> Vec field_local,field_global
>>>>>>> 
>>>>>>> call DMDACreate3d with dof = 4
>>>>>>> 
>>>>>>> call DMCreateGlobalVector(da,field_global,ierr)
>>>>>>>
>>>>>>> call DMCreateLocalVector(da,field_local,ierr)
>>>>>>> 
>>>>>>> call DMGetLocalVector(da, field_local,ierr)    ->      To insert values
>>>>>>> 
>>>>>>> call DMDAVecGetArrayF90(da, field_local,field1,ierr)
>>>>>>> 
>>>>>>> do k = zs, zs + zm - 1
>>>>>>> 
>>>>>>> do j = ys, ys + ym -1
>>>>>>> 
>>>>>>>     do i = xs, xs + xm - 1
>>>>>>> 
>>>>>>>         field1(i,j,k)%u = ...     ->      evaluate u,v,w,p etc
>>>>>>> 
>>>>>>>     end do
>>>>>>> 
>>>>>>> end do
>>>>>>> 
>>>>>>> call DMDAVecRestoreArrayF90(da,field_local,field1,ierr)
>>>>>>> 
>>>>>>> call DMLocalToGlobalBegin(da,field_local,INSERT_VALUES,field_global,ierr)
>>>>>>> 
>>>>>>> call DMLocalToGlobalEnd(da,field_local,INSERT_VALUES,field_global,ierr)
>>>>>>> 
>>>>>>> call DMRestoreLocalVector(da,field_local,ierr)
>>>>>>> 
>>>>>>> Is this the correct way?
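>>>>>>>
>>>>>>> (As a minimal sketch of where the loop bounds xs,ys,zs,xm,ym,zm above come from, assuming da and the usual declarations as in ex11f90.F:)
>>>>>>>
>>>>>>> PetscInt xs,ys,zs,xm,ym,zm
>>>>>>>
>>>>>>> ! owned corner and widths of this process' part of the grid;
>>>>>>> ! use DMDAGetGhostCorners instead to loop over ghost points as well
>>>>>>> call DMDAGetCorners(da,xs,ys,zs,xm,ym,zm,ierr)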
>>>>>>> 
>>>>>>> Also, suppose I now want to solve my u,v,w momentum equations. Although they're not coupled together, I believe it's faster if I assemble them into one big matrix.
>>>>>>> 
>>>>>>> So for Ax = b, x = (field(1,1,1)%u,field(1,1,1)%v,field(1,1,1)%w,field(2,1,1)%u.... )
>>>>>>> 
>>>>>>> For b, do I duplicate a Vec similar to field_local?
>>>>>>> 
>>>>>>> What about matrix A? Do I use the MatSetValuesStencil?
>>>>>>> 
>>>>>>> Yes
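>>>>>>>
>>>>>>> (A rough Fortran sketch of inserting one coefficient, assuming A comes from the same DMDA, e.g. via DMCreateMatrix, and assuming the MatStencil_i/j/k/c constants from the PETSc Fortran include files; this is illustrative only:)
>>>>>>>
>>>>>>> MatStencil   row(4,1),col(4,1)
>>>>>>> PetscScalar  v(1)
>>>>>>> PetscInt     ione
>>>>>>>
>>>>>>> ione = 1
>>>>>>> ! a row/column is addressed by its grid indices (i,j,k) plus the component number
>>>>>>> row(MatStencil_i,1) = i
>>>>>>> row(MatStencil_j,1) = j
>>>>>>> row(MatStencil_k,1) = k
>>>>>>> row(MatStencil_c,1) = 0          ! component: 0=u, 1=v, 2=w, 3=p
>>>>>>> col(MatStencil_i,1) = i
>>>>>>> col(MatStencil_j,1) = j
>>>>>>> col(MatStencil_k,1) = k
>>>>>>> col(MatStencil_c,1) = 0
>>>>>>> v(1) = 1.0                       ! placeholder value for this entry
>>>>>>> call MatSetValuesStencil(A,ione,row,ione,col,v,INSERT_VALUES,ierr)
>>>>>>>
>>>>>>> (followed by the usual MatAssemblyBegin/MatAssemblyEnd.)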
>>>>>>> 
>>>>>>> 
>>>>>>> Lastly, the type field contains u,v,w and p. However, I'm only solving for u,v,w. Do I have to skip some values or use an identity matrix to solve it?
>>>>>>> 
>>>>>>> Why not make a field that contains only u,v,w? I don't see what you're trying to do.
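>>>>>>>
>>>>>>> (i.e. something like the following, a sketch with a hypothetical type name:)
>>>>>>>
>>>>>>> type velocity
>>>>>>>
>>>>>>> real u,v,w
>>>>>>>
>>>>>>> end type velocity
>>>>>>>
>>>>>>> with the DMDA created with dof = 3.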
>>>>>> 
>>>>>> -- 
>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>> -- Norbert Wiener


