On Mon, Jun 18, 2012 at 7:30 PM, TAY wee-beng <span dir="ltr"><<a href="mailto:zonexo@gmail.com" target="_blank">zonexo@gmail.com</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000">
<br>
On 17/6/2012 2:33 PM, Jed Brown wrote:
<blockquote type="cite">
<div class="gmail_quote">On Sun, Jun 17, 2012 at 7:26 AM, TAY
wee-beng <span dir="ltr"><<a href="mailto:zonexo@gmail.com" target="_blank">zonexo@gmail.com</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000">
<div> <br>
On 16/6/2012 9:24 AM, Jed Brown wrote:
<blockquote type="cite">
<p>It depends on how you want to solve the problem. I
usually group all dofs together. There is a 2D
Stokes+Thermodynamics example in SNES ex30 (or 31?).</p>
</blockquote>
<br>
</div>
I tried to understand ex30. I have some questions since it's
in C and I'm used to Fortran programming.<br>
</div>
</blockquote>
<div><br>
</div>
<div>This looks about right, see
src/dm/examples/tutorials/ex11f90.F.</div>
</div>
</blockquote>
<br>
I tried to build and run ex11f90 on Linux. However, it's not
working.<br>
<br>
I have attached the run and valgrind output:<br></div></blockquote><div><br></div><div>It appears to be a problem with your MPI.</div><div><br></div><div>  Matt</div>
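<div><br></div><div>One way to isolate this (a sketch only, not something tested here; the program name mpitest is made up): build a bare MPI hello world with the same mpif90 that PETSc was configured with (under /opt/openmpi-1.5.3/), and run it on one process and under valgrind. If it also crashes or reports errors inside opal_init/orte_init, the problem is below PETSc.</div>
<div><i>
      program mpitest<br>
      implicit none<br>
      include 'mpif.h'<br>
      integer :: ierr, rank, nprocs<br>
<br>
      call MPI_Init(ierr)<br>
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)<br>
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)<br>
      print *, 'rank ', rank, ' of ', nprocs<br>
      call MPI_Finalize(ierr)<br>
      end program mpitest<br>
</i></div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">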
<div bgcolor="#FFFFFF" text="#000000">
run output:<br>
<i><br>
[wtay@hpc12:tutorials]$ ./ex11f90 <br>
[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
Violation, probably memory access out of range<br>
[0]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger<br>
[0]PETSC ERROR: or see
<a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>[0]PETSC
ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
to find memory corruption errors<br>
[0]PETSC ERROR: likely location of problem given in stack below<br>
[0]PETSC ERROR: --------------------- Stack Frames
------------------------------------<br>
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
available,<br>
[0]PETSC ERROR: INSTEAD the line number of the start of the
function<br>
[0]PETSC ERROR: is given.<br>
[0]PETSC ERROR: --------------------- Error Message
------------------------------------<br>
[0]PETSC ERROR: Signal received!<br>
[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Development HG revision:
f3b998b41b349e16d47fe42b0e223d3462737e05 HG Date: Fri Jun 15
17:50:32 2012 -0500<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: See docs/faq.html for hints about trouble
shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: ./ex11f90 on a petsc-3.3 named hpc12 by wtay Tue
Jun 19 03:22:52 2012<br>
[0]PETSC ERROR: Libraries linked from
/home/wtay/Lib/petsc-3.3-dev_shared_debug/lib<br>
[0]PETSC ERROR: Configure run at Sun Jun 17 16:51:29 2012<br>
[0]PETSC ERROR: Configure options
--with-mpi-dir=/opt/openmpi-1.5.3/
--with-blas-lapack-dir=/opt/intelcpro-11.1.059/mkl/lib/em64t/
--with-debugging=1 --download-hypre=1
--prefix=/home/wtay/Lib/petsc-3.3-dev_shared_debug
--known-mpi-shared=1 --with-shared-libraries<br>
[0]PETSC ERROR:
------------------------------------------------------------------------<br>
[0]PETSC ERROR: User provided function() line 0 in unknown
directory unknown file<br>
--------------------------------------------------------------------------<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD <br>
with errorcode 59.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI
processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------</i><br>
<br>
valgrind output:<br>
<i><br>
==6027== Memcheck, a memory error detector<br>
==6027== Copyright (C) 2002-2011, and GNU GPL'd, by Julian Seward
et al.<br>
==6027== Using Valgrind-3.7.0 and LibVEX; rerun with -h for
copyright info<br>
==6027== Command: ex11f90<br>
==6027== <br>
==6027== Invalid read of size 8<br>
==6027== at 0xA60148D: _wordcopy_fwd_dest_aligned (in
/lib64/<a href="http://libc-2.12.so" target="_blank">libc-2.12.so</a>)<br>
==6027== by 0xA5FB11D: __GI_memmove (in /lib64/<a href="http://libc-2.12.so" target="_blank">libc-2.12.so</a>)<br>
==6027== by 0xA6027DB: argz_insert (in /lib64/<a href="http://libc-2.12.so" target="_blank">libc-2.12.so</a>)<br>
==6027== by 0x95CAF25: lt_argz_insert (ltdl.c:1679)<br>
==6027== by 0x95CB7D0: foreachfile_callback (ltdl.c:1718)<br>
==6027== by 0x95CB4F1: foreach_dirinpath (ltdl.c:710)<br>
==6027== by 0x95CB580: lt_dlforeachfile (ltdl.c:1865)<br>
==6027== by 0x95DB999: mca_base_component_find
(mca_base_component_find.c:301)<br>
==6027== by 0x95DC4B0: mca_base_components_open
(mca_base_components_open.c:128)<br>
==6027== by 0x95F7CE7: opal_paffinity_base_open
(paffinity_base_open.c:112)<br>
==6027== by 0x95C39FE: opal_init (opal_init.c:307)<br>
==6027== by 0x95815A4: orte_init (orte_init.c:78)<br>
==6027== Address 0xb39d9a8 is 40 bytes inside a block of size 43
alloc'd<br>
==6027== at 0x4C267BA: malloc (vg_replace_malloc.c:263)<br>
==6027== by 0x95CA358: lt__malloc (lt__alloc.c:54)<br>
==6027== by 0x95CB751: foreachfile_callback (ltdl.c:1764)<br>
==6027== by 0x95CB4F1: foreach_dirinpath (ltdl.c:710)<br>
==6027== by 0x95CB580: lt_dlforeachfile (ltdl.c:1865)<br>
==6027== by 0x95DB999: mca_base_component_find
(mca_base_component_find.c:301)<br>
==6027== by 0x95DC4B0: mca_base_components_open
(mca_base_components_open.c:128)<br>
==6027== by 0x95F7CE7: opal_paffinity_base_open
(paffinity_base_open.c:112)<br>
==6027== by 0x95C39FE: opal_init (opal_init.c:307)<br>
==6027== by 0x95815A4: orte_init (orte_init.c:78)<br>
==6027== by 0x95432AE: ompi_mpi_init (ompi_mpi_init.c:350)<br>
==6027== by 0x955938F: PMPI_Init (pinit.c:84)<br>
==6027== <br>
==6027== Syscall param writev(vector[...]) points to uninitialised
byte(s)<br>
==6027== at 0xA65692B: writev (in /lib64/<a href="http://libc-2.12.so" target="_blank">libc-2.12.so</a>)<br>
==6027== by 0xC9A2C56: mca_oob_tcp_msg_send_handler
(oob_tcp_msg.c:249)<br>
==6027== by 0xC9A417C: mca_oob_tcp_peer_send
(oob_tcp_peer.c:204)<br>
==6027== by 0xC9A67FC: mca_oob_tcp_send_nb (oob_tcp_send.c:167)<br>
==6027== by 0xC3953B5: orte_rml_oob_send (rml_oob_send.c:136)<br>
==6027== by 0xC3955FF: orte_rml_oob_send_buffer
(rml_oob_send.c:270)<br>
==6027== by 0xCDB1E87: modex (grpcomm_bad_module.c:573)<br>
==6027== by 0x95436F1: ompi_mpi_init (ompi_mpi_init.c:682)<br>
==6027== by 0x955938F: PMPI_Init (pinit.c:84)<br>
==6027== by 0x8AB4FF4: MPI_INIT (pinit_f.c:75)<br>
==6027== by 0x50ED97E: petscinitialize_ (zstart.c:299)<br>
==6027== by 0x40881C: MAIN__ (ex11f90.F:43)<br>
==6027== Address 0xed30cd1 is 161 bytes inside a block of size
256 alloc'd<br>
==6027== at 0x4C268B2: realloc (vg_replace_malloc.c:632)<br>
==6027== by 0x95C4F22: opal_dss_buffer_extend
(dss_internal_functions.c:63)<br>
==6027== by 0x95C5A64: opal_dss_copy_payload
(dss_load_unload.c:164)<br>
==6027== by 0x95A1246: orte_grpcomm_base_pack_modex_entries
(grpcomm_base_modex.c:861)<br>
==6027== by 0xCDB1E3C: modex (grpcomm_bad_module.c:563)<br>
==6027== by 0x95436F1: ompi_mpi_init (ompi_mpi_init.c:682)<br>
==6027== by 0x955938F: PMPI_Init (pinit.c:84)<br>
==6027== by 0x8AB4FF4: MPI_INIT (pinit_f.c:75)<br>
==6027== by 0x50ED97E: petscinitialize_ (zstart.c:299)<br>
==6027== by 0x40881C: MAIN__ (ex11f90.F:43)<br>
==6027== by 0x4087AB: main (in
/home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)<br>
==6027== <br>
==6027== Invalid read of size 8<br>
==6027== at 0x50FC6D8: vecview_ (zvectorf.c:56)<br>
==6027== by 0x408A05: MAIN__ (ex11f90.F:56)<br>
==6027== by 0xEE1CC2F: ???<br>
==6027== by 0xEE5A0AF: ???<br>
==6027== by 0x6F5C9F: ??? (in
/home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)<br>
==6027== by 0x4962FF: ??? (in
/home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)<br>
==6027== by 0x6F5C9F: ??? (in
/home/wtay/Codes/petsc-dev/src/dm/examples/tutorials/ex11f90)<br>
==6027== by 0x7FEFFFC13: ???<br>
==6027== Address 0xfffffffffffffeb8 is not stack'd, malloc'd or
(recently) free'd<br>
==6027== </i><br>
<br>
<br>
<br>
<blockquote type="cite">
<div class="gmail_quote">
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000"> <br>
Suppose I have a field with u, v, w, p. In order to use them,
I do the following:<br>
<i><br>
type field<br>
<br>
real u,v,w,p<br>
<br>
end type field<br>
<br>
type(field), pointer :: field1(:,:,:)   ! make a derived-type variable</i><br>
<br>
Also:<br>
<i><br>
Vec field_local,field_global<br>
<br>
call DMDACreate3d(...)   ! created with dof = 4<br>
<br>
call DMCreateGlobalVector(da, field_global, ierr)<br>
<br>
call DMCreateLocalVector(da, field_local, ierr)<br>
<br>
call DMGetLocalVector(da, field_local, ierr)   ! to insert values<br>
<br>
call DMDAVecGetArrayF90(da, field_local,field1,ierr)<br>
<br>
do k = zs, zs + zm - 1<br>
<br>
   do j = ys, ys + ym - 1<br>
<br>
      do i = xs, xs + xm - 1<br>
<br>
         field1(i,j,k)%u = ...   ! evaluate u,v,w,p etc.<br>
<br>
      end do<br>
<br>
   end do<br>
<br>
end do<br>
<br>
call DMDAVecRestoreArrayF90(da,field_local,field1,ierr)<br>
<br>
call DMLocalToGlobalBegin(da, field_local, INSERT_VALUES, field_global, ierr)<br>
<br>
call DMLocalToGlobalEnd(da, field_local, INSERT_VALUES, field_global, ierr)<br>
<br>
call DMRestoreLocalVector(da,field_local,ierr)</i><br>
<br>
Is this the correct way?<br>
<br>
Also, suppose I now want to solve my u, v, w momentum equations.
Although they're not coupled together, I believe it's faster
if I assemble them into one big matrix.<br>
<br>
So for Ax = b, x = (field1(1,1,1)%u, field1(1,1,1)%v, field1(1,1,1)%w, field1(2,1,1)%u, ...)<br>
<br>
For b, do I duplicate a Vec similar to field_local?<br>
<br>
What about matrix A? Do I use MatSetValuesStencil?<br>
</div>
</blockquote>
<div><br>
</div>
<div>Yes</div>
<div> </div>
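<div>For instance, with declarations like these at the top of the routine and the assignments inside the same i, j, k loops as above, one row of the u-momentum equation could be set as follows. This is only a sketch: it assumes the MatStencil_i/j/k/c index constants from the Fortran include files, a matrix A obtained from the DMDA (e.g. via DMCreateMatrix), component numbering 0=u, 1=v, 2=w, 3=p matching the derived type, and made-up stencil coefficients.</div>
<div><i>
      MatStencil  row(4), col(4,2)<br>
      PetscScalar vals(2)<br>
<br>
      row(MatStencil_i) = i<br>
      row(MatStencil_j) = j<br>
      row(MatStencil_k) = k<br>
      row(MatStencil_c) = 0            ! component 0 = u<br>
<br>
      col(MatStencil_i,1) = i          ! diagonal entry<br>
      col(MatStencil_j,1) = j<br>
      col(MatStencil_k,1) = k<br>
      col(MatStencil_c,1) = 0<br>
      col(MatStencil_i,2) = i - 1      ! coupling to u at (i-1,j,k)<br>
      col(MatStencil_j,2) = j<br>
      col(MatStencil_k,2) = k<br>
      col(MatStencil_c,2) = 0<br>
<br>
      vals(1) = 2.0                    ! placeholder coefficients<br>
      vals(2) = -1.0<br>
<br>
      call MatSetValuesStencil(A, 1, row, 2, col, vals, INSERT_VALUES, ierr)<br>
<br>
      ! after all rows have been set:<br>
      ! call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)<br>
      ! call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)<br>
</i></div>
<div>For b, a global vector with the same layout (e.g. VecDuplicate on field_global) is what is needed.</div>
<div> </div>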
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000"> <br>
Lastly, the type field contains u, v, w and p. However, I'm
only solving for u, v, w. Do I have to skip some values, or use
an identity matrix to solve it?<br>
</div>
</blockquote>
<div><br>
</div>
<div>Why not make a field that contains only u, v, w? I don't see
what you're trying to do.</div>
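<div><br></div>
<div>That is (a sketch of the suggestion; the names velocity and vel are made up): a separate 3-component field for the momentum solve, keeping p out of it.</div>
<div><i>
      type velocity<br>
<br>
      real u, v, w<br>
<br>
      end type velocity<br>
<br>
      type(velocity), pointer :: vel(:,:,:)<br>
<br>
      ! create the momentum DMDA with dof = 3; keep p in its own dof = 1 DMDA<br>
      ! (or leave it in the 4-component one above)<br>
</i></div>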
</div>
</blockquote>
</div>
</blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>