[petsc-users] DMPlexProjectField not including BC constraints?

Matthew Knepley knepley at gmail.com
Fri Apr 24 17:41:48 CDT 2015


On Fri, Apr 24, 2015 at 5:38 PM, Justin Chang <jchang27 at uh.edu> wrote:

> I actually made a mistake in PetscFECreateDefault(...), if that's what
> you're referring to, and put dm instead of user.dmVel. However, the
> problem remains.
>
> Everything is a scalar (i.e., numComp is 1 for both DMs). dm and
> user.dmVel should have the same chart, but user.dmVel does not invoke
> DMPlexAddBoundary(). Each spatial index of u_x[] will be printed
> separately. Right now I am simply working with the x component.
>

Okay, if everything is scalar, then it should work. Can you make a small
example so that I can debug it?

  Thanks,

     Matt


> That said, is it possible to get this sort of mapping?
>
> Thanks,
>
> On Fri, Apr 24, 2015 at 8:09 AM, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Thu, Apr 23, 2015 at 10:20 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>
>>> Matt,
>>>
>>> I have attempted this, but I am getting an error when trying to output
>>> the vector to HDF5. Here is my implementation:
>>>
>>> /* Initialize */
>>> PetscDS prob;
>>> Vec vlocal, ulocal, vglobal;
>>> void (*vX[1])(....) = {velx};
>>>
>>> /* Create DM for velocity */
>>> ierr = DMClone(dm,&user.dmVel);CHKERRQ(ierr);
>>> ierr = DMPlexCopyCoordinates(dm, user.dmVel);CHKERRQ(ierr);
>>> ierr = PetscFECreateDefault(dm, spatialDim, 1, PETSC_TRUE, "solution_", -1, &user.feVel);CHKERRQ(ierr);
>>> ierr = DMGetDS(user.dmVel, &prob);CHKERRQ(ierr);
>>> ierr = PetscDSSetDiscretization(prob, 0, (PetscObject) user.feVel);CHKERRQ(ierr);
>>> ierr = DMCreateGlobalVector(user.dmVel,&vglobal);CHKERRQ(ierr);
>>> ierr = PetscObjectSetName((PetscObject) vglobal, "velocity");CHKERRQ(ierr);
>>>
>>> /* Obtain velocity */
>>> ierr = DMGetLocalVector(dm,&ulocal);CHKERRQ(ierr);
>>> ierr = VecDuplicate(ulocal,&vlocal);CHKERRQ(ierr);
>>> ierr = DMGlobalToLocalBegin(dm,user.u,INSERT_ALL_VALUES,ulocal);CHKERRQ(ierr);
>>> ierr = DMGlobalToLocalEnd(dm,user.u,INSERT_ALL_VALUES,ulocal);CHKERRQ(ierr);
>>> ierr = DMPlexProjectFieldLocal(dm,ulocal,vX,INSERT_ALL_VALUES,vlocal);CHKERRQ(ierr);
>>> ierr = DMLocalToGlobalBegin(dm,vlocal,INSERT_ALL_VALUES,vglobal);CHKERRQ(ierr);
>>> ierr = DMLocalToGlobalEnd(dm,vlocal,INSERT_ALL_VALUES,vglobal);CHKERRQ(ierr);
>>>
>>
>> 1) Shouldn't that be user.dmVel above, not dm?
>>
>> 2) Are you changing from vector to scalar here? I think so, and so you
>> would need an extra step, since that projection assumes that the layout is
>> the same as the input.
>>
>> I do not currently have anything that maps between different layouts like
>> you want. It will have to be written. THEN we can use the above code.
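>>
>> For point 1, a minimal sketch of the fix (untested, and assuming the two
>> layouts really are identical scalar layouts, as you say): take the local
>> vector from user.dmVel and do the local-to-global scatter with user.dmVel,
>> since vglobal was created from that DM.
>>
>> /* Sketch: vglobal lives on user.dmVel, so every call that touches it
>>    must use user.dmVel rather than dm */
>> ierr = DMGetLocalVector(user.dmVel,&vlocal);CHKERRQ(ierr);
>> ierr = DMPlexProjectFieldLocal(dm,ulocal,vX,INSERT_ALL_VALUES,vlocal);CHKERRQ(ierr);
>> ierr = DMLocalToGlobalBegin(user.dmVel,vlocal,INSERT_ALL_VALUES,vglobal);CHKERRQ(ierr);
>> ierr = DMLocalToGlobalEnd(user.dmVel,vlocal,INSERT_ALL_VALUES,vglobal);CHKERRQ(ierr);
>> ierr = DMRestoreLocalVector(user.dmVel,&vlocal);CHKERRQ(ierr);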
>>
>>     Matt
>>
>>
>>> ierr = VecViewFromOptions(vglobal,NULL,"-vec_view_x");CHKERRQ(ierr);
>>>
>>> And as soon as I get to the last line, it gives me this error:
>>>
>>>
>>> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscViewerFileSetName_HDF5() line 251 in /home/justin/petsc-dev/src/sys/classes/viewer/impls/hdf5/hdf5v.c
>>> [0]PETSC ERROR: Memory at address 0x3077ac0 is corrupted
>>> [0]PETSC ERROR: Probably write past beginning or end of array
>>> [0]PETSC ERROR: Last intact block allocated in VecCreate_Seq() line 38 in /home/justin/petsc-dev/src/vec/vec/impls/seq/bvec3.c
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Memory corruption: http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
>>> [0]PETSC ERROR:
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2766-g0d12e5d  GIT Date: 2015-04-21 21:32:23 -0500
>>> [0]PETSC ERROR: ./main on a arch-linux2-c-debug named pacotaco by justin Thu Apr 23 22:14:21 2015
>>> [0]PETSC ERROR: Configure options --download-chaco --download-exodusii
>>> --download-fblaslapack --download-hdf5 --download-metis --download-mpich
>>> --download-mumps --download-netcdf --download-parmetis --download-scalapack
>>> --download-tetgen --download-triangle --with-cc=gcc --with-clanguage=cxx
>>> --with-cmake=cmake --with-ctetgen --with-cxx=g++ --with-debugging=1
>>> --with-fc=gfortran --with-valgrind=1 PETSC_ARCH=arch-linux2-c-debug
>>> [0]PETSC ERROR: #1 PetscMallocValidate() line 135 in /home/justin/petsc-dev/src/sys/memory/mtr.c
>>> [0]PETSC ERROR: #2 PetscViewerFileSetName_HDF5() line 251 in /home/justin/petsc-dev/src/sys/classes/viewer/impls/hdf5/hdf5v.c
>>> [0]PETSC ERROR: #3 PetscViewerFileSetName() line 623 in /home/justin/petsc-dev/src/sys/classes/viewer/impls/ascii/filev.c
>>> [0]PETSC ERROR: #4 PetscOptionsGetViewer() line 130 in /home/justin/petsc-dev/src/sys/classes/viewer/interface/viewreg.c
>>> [0]PETSC ERROR: #5 PetscObjectViewFromOptions() line 2622 in /home/justin/petsc-dev/src/sys/objects/options.c
>>> [0]PETSC ERROR: #6 main() line 1237 in /home/justin/Dropbox/dmplex-nonneg/main.c
>>> [0]PETSC ERROR: PETSc Option Table entries:
>>> [0]PETSC ERROR: -al 100
>>> [0]PETSC ERROR: -at 0.01
>>> [0]PETSC ERROR: -bcloc 496000,496000,536000,542000,0,100,503000,503000,536000,542000,0,100,496000,503000,536000,536000,0,100,496000,503000,542000,542000,0,100
>>> [0]PETSC ERROR: -bcnum 4
>>> [0]PETSC ERROR: -bcval 1,1,1,1
>>> [0]PETSC ERROR: -dim 3
>>> [0]PETSC ERROR: -dm_refine 0
>>> [0]PETSC ERROR: -dt 0.01
>>> [0]PETSC ERROR: -edges 3,3,3
>>> [0]PETSC ERROR: -floc 499550,499600,538950,539000,0,100
>>> [0]PETSC ERROR: -flow datafiles/chrom_flow.dat
>>> [0]PETSC ERROR: -fnum 1
>>> [0]PETSC ERROR: -ftime 0,99
>>> [0]PETSC ERROR: -fval -0.1
>>> [0]PETSC ERROR: -ksp_rtol 1.0e-9
>>> [0]PETSC ERROR: -ksp_type cg
>>> [0]PETSC ERROR: -lower 0,0,0
>>> [0]PETSC ERROR: -mat_petscspace_order 0
>>> [0]PETSC ERROR: -mesh datafiles/chrom_mesh.dat
>>> [0]PETSC ERROR: -mu 3.95e-5
>>> [0]PETSC ERROR: -nonneg 0
>>> [0]PETSC ERROR: -numsteps 0
>>> [0]PETSC ERROR: -options_left 0
>>> [0]PETSC ERROR: -pc_type jacobi
>>> [0]PETSC ERROR: -progress 1
>>> [0]PETSC ERROR: -solution_petscspace_order 1
>>> [0]PETSC ERROR: -tao_type tron
>>> [0]PETSC ERROR: -upper 1,1,1
>>> [0]PETSC ERROR: -vec_view_x hdf5:solx.h5
>>> [0]PETSC ERROR: -vec_view_y hdf5:soly.h5
>>> [0]PETSC ERROR: -vec_view_z hdf5:solz.h5
>>> [0]PETSC ERROR: -vtuname pressure
>>> [0]PETSC ERROR: -vtuprintat 1,20,80,160,320,640,999
>>> [0]PETSC ERROR: -vtusteps 7
>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>>>
>>> Any idea what's going on here? Also, if I replace the last line with
>>> VecView(vglobal,PETSC_VIEWER_STDOUT_WORLD) I get a completely wacko error:
>>>
>>> Vec Object: [garbled binary output omitted] 1 MPI processes
>>>   type: seq
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>>> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 149 in /home/justin/petsc-dev/src/sys/error/signal.c
>>> [0]PETSC ERROR: Memory at address 0x2575ac0 is corrupted
>>> [0]PETSC ERROR: Probably write past beginning or end of array
>>> [0]PETSC ERROR: Last intact block allocated in VecCreate_Seq() line 38 in /home/justin/petsc-dev/src/vec/vec/impls/seq/bvec3.c
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Memory corruption: http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
>>> [0]PETSC ERROR:
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2766-g0d12e5d  GIT Date: 2015-04-21 21:32:23 -0500
>>> [0]PETSC ERROR: ./main on a arch-linux2-c-debug named pacotaco by justin Thu Apr 23 22:17:26 2015
>>> [0]PETSC ERROR: Configure options --download-chaco --download-exodusii
>>> --download-fblaslapack --download-hdf5 --download-metis --download-mpich
>>> --download-mumps --download-netcdf --download-parmetis --download-scalapack
>>> --download-tetgen --download-triangle --with-cc=gcc --with-clanguage=cxx
>>> --with-cmake=cmake --with-ctetgen --with-cxx=g++ --with-debugging=1
>>> --with-fc=gfortran --with-valgrind=1 PETSC_ARCH=arch-linux2-c-debug
>>> [0]PETSC ERROR: #1 PetscMallocValidate() line 135 in /home/justin/petsc-dev/src/sys/memory/mtr.c
>>> [0]PETSC ERROR: #2 PetscSignalHandlerDefault() line 149 in /home/justin/petsc-dev/src/sys/error/signal.c
>>>
>>>
>>> However, if I run the program with debugging off (i.e.,
>>> --with-debugging=0), VecView gives me an answer that seems relatively
>>> correct.
>>>
>>> Any idea what's going on here?
>>>
>>> Thanks,
>>>
>>> On Thu, Apr 23, 2015 at 4:00 AM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Wed, Apr 22, 2015 at 9:57 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> I am using DMPlexProjectField(...) to postprocess my final solution u
>>>>> into another vector sol. I have the following pointwise functions:
>>>>>
>>>>> /* x-component of gradient */
>>>>> void velx(const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
>>>>>           const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
>>>>>           const PetscReal x[], PetscScalar v[])
>>>>> {
>>>>>   v[0] = -a[0]*mu*u_x[0];
>>>>> }
>>>>>
>>>>> /* y-component of gradient */
>>>>> void vely(const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
>>>>>           const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
>>>>>           const PetscReal x[], PetscScalar v[])
>>>>> {
>>>>>   v[0] = -a[0]*mu*u_x[1];
>>>>> }
>>>>>
>>>>> /* z-component of gradient */
>>>>> void velz(const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
>>>>>           const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
>>>>>           const PetscReal x[], PetscScalar v[])
>>>>> {
>>>>>   v[0] = -a[0]*mu*u_x[2];
>>>>> }
>>>>>
>>>>> Here a[0] is the cell-wise permeability and mu is the inverse of the
>>>>> viscosity. I am solving the diffusion equation with P1 elements. At the
>>>>> end of my simulation I have these lines:
>>>>>
>>>>> ierr = VecDuplicate(user.u,&user.sol);CHKERRQ(ierr);
>>>>> ierr = PetscObjectSetName((PetscObject) user.sol, "velocity");CHKERRQ(ierr);
>>>>> void (*vX[1])(const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>               const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>               const PetscReal *, PetscScalar *) = {velx};
>>>>> ierr = DMPlexProjectField(dm,user.u,vX,INSERT_ALL_VALUES,user.sol);CHKERRQ(ierr);
>>>>> ierr = VecViewFromOptions(user.sol,NULL,"-vec_view_x");CHKERRQ(ierr);
>>>>>
>>>>> void (*vY[1])(const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>               const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>               const PetscReal *, PetscScalar *) = {vely};
>>>>> ierr = DMPlexProjectField(dm,user.u,vY,INSERT_ALL_VALUES,user.sol);CHKERRQ(ierr);
>>>>> ierr = VecViewFromOptions(user.sol,NULL,"-vec_view_y");CHKERRQ(ierr);
>>>>>
>>>>> if (spatialDim == 3) {
>>>>>   void (*vZ[1])(const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>                 const PetscScalar *, const PetscScalar *, const PetscScalar *,
>>>>>                 const PetscReal *, PetscScalar *) = {velz};
>>>>>   ierr = DMPlexProjectField(dm,user.u,vZ,INSERT_ALL_VALUES,user.sol);CHKERRQ(ierr);
>>>>>   ierr = VecViewFromOptions(user.sol,NULL,"-vec_view_z");CHKERRQ(ierr);
>>>>> }
>>>>> ierr = VecDestroy(&user.sol);CHKERRQ(ierr);
>>>>>
>>>>> But when I look at the results, the Dirichlet-constrained values of u
>>>>> were projected into the respective locations in sol. I thought passing
>>>>> INSERT_ALL_VALUES would insert the pointwise function values into all
>>>>> locations, including the constrained ones. How can I project the above
>>>>> pointwise functions into the constrained locations? Or what am I
>>>>> missing here?
>>>>>
>>>>
>>>> There are no Dirichlet-constrained values in the global vector
>>>> user.sol. When you call VecView(), the vector is projected into an
>>>> enlarged vector that includes the boundary values, and that is what
>>>> gets output. What you want is to project into a space with no
>>>> constraints, so you really need another DM/Section combination.
>>>>
>>>> I think this will work (a rough sketch follows the list):
>>>>
>>>>   1) Get a local vector from dm
>>>>
>>>>   2) Use DMPlexProjectFieldLocal(INSERT_ALL_VALUES) to fill it up
>>>>
>>>>   3) DMClone(dm) and give the clone a Section with no constraints
>>>>
>>>>   4) Use DMLocalToGlobal() on the new DM to get a global vector
>>>>
>>>>   5) VecView() that new global vector
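>>>>
>>>> A rough, untested sketch of those five steps (dmOut, feOut, and probOut
>>>> are placeholder names; the PetscFECreateDefault() arguments are copied
>>>> from your setup, assuming a scalar P1 field on simplices):
>>>>
>>>> DM      dmOut;
>>>> PetscFE feOut;
>>>> PetscDS probOut;
>>>> Vec     ulocal, vlocal, vglobal;
>>>>
>>>> /* 1) Local vectors from dm; pull the solution down to the local form */
>>>> ierr = DMGetLocalVector(dm, &ulocal);CHKERRQ(ierr);
>>>> ierr = DMGetLocalVector(dm, &vlocal);CHKERRQ(ierr);
>>>> ierr = DMGlobalToLocalBegin(dm, user.u, INSERT_ALL_VALUES, ulocal);CHKERRQ(ierr);
>>>> ierr = DMGlobalToLocalEnd(dm, user.u, INSERT_ALL_VALUES, ulocal);CHKERRQ(ierr);
>>>>
>>>> /* 2) Fill the local vector, including the constrained dofs */
>>>> ierr = DMPlexProjectFieldLocal(dm, ulocal, vX, INSERT_ALL_VALUES, vlocal);CHKERRQ(ierr);
>>>>
>>>> /* 3) Clone dm and give the clone a discretization with no boundary
>>>>       conditions, so its Section has no constraints */
>>>> ierr = DMClone(dm, &dmOut);CHKERRQ(ierr);
>>>> ierr = PetscFECreateDefault(dmOut, spatialDim, 1, PETSC_TRUE, "solution_", -1, &feOut);CHKERRQ(ierr);
>>>> ierr = DMGetDS(dmOut, &probOut);CHKERRQ(ierr);
>>>> ierr = PetscDSSetDiscretization(probOut, 0, (PetscObject) feOut);CHKERRQ(ierr);
>>>>
>>>> /* 4) Scatter into a global vector on the clone; the local layouts match
>>>>       because local vectors always carry the constrained dofs */
>>>> ierr = DMCreateGlobalVector(dmOut, &vglobal);CHKERRQ(ierr);
>>>> ierr = DMLocalToGlobalBegin(dmOut, vlocal, INSERT_VALUES, vglobal);CHKERRQ(ierr);
>>>> ierr = DMLocalToGlobalEnd(dmOut, vlocal, INSERT_VALUES, vglobal);CHKERRQ(ierr);
>>>>
>>>> /* 5) View the unconstrained global vector */
>>>> ierr = VecView(vglobal, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>>>>
>>>> /* Cleanup */
>>>> ierr = DMRestoreLocalVector(dm, &ulocal);CHKERRQ(ierr);
>>>> ierr = DMRestoreLocalVector(dm, &vlocal);CHKERRQ(ierr);
>>>> ierr = VecDestroy(&vglobal);CHKERRQ(ierr);
>>>> ierr = PetscFEDestroy(&feOut);CHKERRQ(ierr);
>>>> ierr = DMDestroy(&dmOut);CHKERRQ(ierr);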
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> Thanks,
>>>>>
>>>>>
>>>>> --
>>>>> Justin Chang
>>>>> PhD Candidate, Civil Engineering - Computational Sciences
>>>>> University of Houston, Department of Civil and Environmental
>>>>> Engineering
>>>>> Houston, TX 77004
>>>>> (512) 963-3262
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>
>>>
>>>
>>> --
>>> Justin Chang
>>> PhD Candidate, Civil Engineering - Computational Sciences
>>> University of Houston, Department of Civil and Environmental Engineering
>>> Houston, TX 77004
>>> (512) 963-3262
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>
>
> --
> Justin Chang
> PhD Candidate, Civil Engineering - Computational Sciences
> University of Houston, Department of Civil and Environmental Engineering
> Houston, TX 77004
> (512) 963-3262
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener