[petsc-users] using petsc tools to solve isolated irregular domains with finite difference
Matthew Knepley
knepley at gmail.com
Sun Oct 27 12:47:36 CDT 2013
On Sun, Oct 27, 2013 at 10:04 AM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
>
>
>
> On Sun, Oct 27, 2013 at 2:53 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Sat, Oct 26, 2013 at 8:27 AM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
>>
>>>
>>>
>>>
>>> On Sat, Oct 26, 2013 at 2:57 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>> On Sat, Oct 26, 2013 at 3:12 AM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Oct 25, 2013 at 10:21 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>
>>>>>> On Fri, Oct 25, 2013 at 2:55 PM, Bishesh Khanal <bisheshkh at gmail.com> wrote:
>>>>>>
>>>>>>> On Fri, Oct 25, 2013 at 8:18 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>>
>>>>>>>> On Fri, Oct 25, 2013 at 12:09 PM, Bishesh Khanal <
>>>>>>>> bisheshkh at gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Dear all,
>>>>>>>>> I would like to know if some of the petsc objects that I have not
>>>>>>>>> used so far (IS, DMPlex, PetscSection) could be useful in the following
>>>>>>>>> case (of irregular domains):
>>>>>>>>>
>>>>>>>>> Let's say that I have a 3D binary image (a cube).
>>>>>>>>> The binary information of the image partitions the cube into a
>>>>>>>>> computational domain and non-computational domain.
>>>>>>>>> I must solve a PDE (say a Poisson equation) only on the
>>>>>>>>> computational domains (e.g. two isolated spheres within the cube). I'm
>>>>>>>>> using finite differences and, say, a Dirichlet boundary condition.
>>>>>>>>>
>>>>>>>>> I know that I can create a DMDA that will let me access the
>>>>>>>>> information from this 3D binary image, get all the coefficients, RHS values
>>>>>>>>> etc. using the natural indexing (i,j,k).
>>>>>>>>>
>>>>>>>>> Now, I would like to create a matrix corresponding to the Laplace
>>>>>>>>> operator (e.g. with the standard 7-point stencil), and the corresponding RHS
>>>>>>>>> that takes care of the Dirichlet values too.
>>>>>>>>> But this matrix should have rows only for the nodes that lie in the
>>>>>>>>> computational domain. It would be nice if I could easily
>>>>>>>>> (using (i,j,k) indexing) put on the RHS the Dirichlet values corresponding to
>>>>>>>>> the boundary points.
>>>>>>>>> Then, once the system is solved, put the values of the solution
>>>>>>>>> back to the corresponding positions in the binary image.
>>>>>>>>> Later, I might have to extend this for the staggered grid case too.
>>>>>>>>> So is PetscSection or DMPlex suitable for this, so that I can set
>>>>>>>>> up the matrix with something like DMCreateMatrix? Or what would you
>>>>>>>>> suggest as a suitable approach to this problem?
>>>>>>>>>
>>>>>>>>> I have looked at the manual, and that led me to search for
>>>>>>>>> simpler examples in the PETSc src directories. But most of the ones I
>>>>>>>>> encountered use FEM (and I'm not familiar at all with FEM, so these
>>>>>>>>> examples serve more as a distraction with their FEM jargon!)
>>>>>>>>>
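For reference, the DMDA-plus-image setup described above could look roughly like the following minimal, untested sketch. Here image_value(i,j,k) is only a placeholder for however the binary image is actually read, nx,ny,nz are its dimensions, and DMDA_BOUNDARY_NONE / DMDA_STENCIL_STAR are the petsc-3.4 names:

static PetscErrorCode CreateMaskDMDA(PetscInt nx, PetscInt ny, PetscInt nz,
                                     DM *da, Vec *maskLocal)
{
  PetscErrorCode ierr;
  Vec            maskGlobal;
  PetscScalar    ***m;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;

  PetscFunctionBeginUser;
  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, nx, ny, nz,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1 /* dof */, 1 /* stencil width */, NULL, NULL, NULL, da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(*da, &maskGlobal);CHKERRQ(ierr);
  ierr = DMDAGetCorners(*da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(*da, maskGlobal, &m);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; ++k)
    for (j = ys; j < ys+ym; ++j)
      for (i = xs; i < xs+xm; ++i)
        m[k][j][i] = image_value(i, j, k);   /* placeholder: 1 inside the domain, 0 outside */
  ierr = DMDAVecRestoreArray(*da, maskGlobal, &m);CHKERRQ(ierr);
  /* ghosted local copy, so neighbour values are available during assembly */
  ierr = DMCreateLocalVector(*da, maskLocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(*da, maskGlobal, INSERT_VALUES, *maskLocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(*da, maskGlobal, INSERT_VALUES, *maskLocal);CHKERRQ(ierr);
  ierr = VecDestroy(&maskGlobal);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The ghosted local vector is what makes neighbour values of the mask available during the stencil assembly sketched further below.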
>>>>>>>>
>>>>>>>> It sounds like the right solution for this is to use PetscSection
>>>>>>>> on top of DMDA. I am working on this, but it is really
>>>>>>>> alpha code. If you feel comfortable with that level of development,
>>>>>>>> we can help you.
>>>>>>>>
>>>>>>>
>>>>>>> Thanks, with my (short) experience of using PETSc so far, and being
>>>>>>> familiar with the awesomeness (quick and helpful replies) of this mailing
>>>>>>> list, I would like to give it a try. Please give me some pointers to get
>>>>>>> going for the example case I mentioned above. A simple example of using
>>>>>>> PetscSection along with DMDA for finite volume (no FEM) would be great, I
>>>>>>> think.
>>>>>>> Just a note: I'm currently using PETSc 3.4.3 and have not used
>>>>>>> the development version before.
>>>>>>>
>>>>>>
>>>>>> Okay,
>>>>>>
>>>>>> 1) clone the repository using Git and build the 'next' branch.
>>>>>>
>>>>>
>>>>> I encountered errors when doing make on the 'next' branch. The errors
>>>>> are as follows (I tried attaching the configure.log file, but the email
>>>>> bounced back saying it awaits moderator approval for having too big an
>>>>> attachment, so I'm sending this one with only make.log attached):
>>>>>
>>>>
>>>> They are fixed. Pull again and rebuild.
>>>>
>>>
>>> Doing git pull in the next branch says "Already up-to-date."; I'm not
>>> sure if it should have said that. I still tried ./configure and make
>>> again, but it returns the same error. Does it take some time to get updated
>>> on the servers, or do I need to do anything special other than the
>>> following?
>>> git checkout next
>>> git pull
>>>
>>
>> Okay, it should be this:
>>
>> git checkout next
>> git pull
>> make allfortranstubs
>> make
>>
>
> Thanks, but now it reports another error:
>
> CXX arch-linux2-cxx-debug/obj/src/dm/impls/redundant/dmredundant.o
> CXX
> arch-linux2-cxx-debug/obj/src/dm/impls/redundant/ftn-auto/dmredundantf.o
> CXX arch-linux2-cxx-debug/obj/src/dm/impls/plex/plexcreate.o
> src/dm/impls/da/hypre/mhyp.c: In function ‘PetscErrorCode
> MatSetupDM_HYPREStruct(Mat, DM)’:
> src/dm/impls/da/hypre/mhyp.c:485:58: error: invalid conversion from
> ‘PetscInt** {aka int**}’ to ‘const PetscInt** {aka const int**}’
> [-fpermissive]
> /home/bkhanal/Documents/softwares/petsc/include/petscdmda.h:68:27:
> error: initializing argument 3 of ‘PetscErrorCode
> DMDAGetGlobalIndices(DM, PetscInt*, const PetscInt**)’ [-fpermissive]
> src/dm/impls/da/hypre/mhyp.c: In function ‘PetscErrorCode
> MatSetupDM_HYPRESStruct(Mat, DM)’:
> src/dm/impls/da/hypre/mhyp.c:963:58: error: invalid conversion from
> ‘PetscInt** {aka int**}’ to ‘const PetscInt** {aka const int**}’
> [-fpermissive]
> /home/bkhanal/Documents/softwares/petsc/include/petscdmda.h:68:27:
> error: initializing argument 3 of ‘PetscErrorCode
> DMDAGetGlobalIndices(DM, PetscInt*, const PetscInt**)’ [-fpermissive]
>
Yes, one of the test branches had a problem with Hypre. I pushed a fix.
Please try again.
Thanks,
Matt
> gmake[2]: *** [arch-linux2-cxx-debug/obj/src/dm/impls/da/hypre/mhyp.o]
> Error 1
> gmake[2]: *** Waiting for unfinished jobs....
> gmake[2]: Leaving directory `/home/bkhanal/Documents/softwares/petsc'
> gmake[1]: *** [gnumake] Error 2
> gmake[1]: Leaving directory `/home/bkhanal/Documents/softwares/petsc'
> **************************ERROR*************************************
> Error during compile, check arch-linux2-cxx-debug/conf/make.log
> Send it and arch-linux2-cxx-debug/conf/configure.log to
> petsc-maint at mcs.anl.gov
> ********************************************************************
> make: *** [all] Error 1
>
>
>>
>> Thanks,
>>
>> Matt
>>
>>
>>>
>>>> Matt
>>>>
>>>>
>>>>> CXX arch-linux2-cxx-debug/obj/src/mat/order/ftn-auto/spectralf.o
>>>>> CXX
>>>>> arch-linux2-cxx-debug/obj/src/mat/order/ftn-custom/zsorderf.o
>>>>> src/mat/order/wbm.c: In function ‘PetscErrorCode
>>>>> MatGetOrdering_WBM(Mat, MatOrderingType, _p_IS**, _p_IS**)’:
>>>>> src/mat/order/wbm.c:12:24: warning: variable ‘cntl’ set but not used
>>>>> [-Wunused-but-set-variable]
>>>>> src/mat/order/wbm.c:15:36: warning: unused variable ‘num’
>>>>> [-Wunused-variable]
>>>>> src/mat/order/wbm.c:15:56: warning: variable ‘icntl’ set but not used
>>>>> [-Wunused-but-set-variable]
>>>>> src/mat/order/wbm.c:15:66: warning: unused variable ‘info’
>>>>> [-Wunused-variable]
>>>>> CXX arch-linux2-cxx-debug/obj/src/mat/matfd/fdmatrix.o
>>>>> src/mat/order/ftn-auto/spectralf.c: In function ‘void
>>>>> matcreatelaplacian_(Mat, PetscReal*, PetscBool*, _p_Mat**, int*)’:
>>>>> src/mat/order/ftn-auto/spectralf.c:44:40: error: ‘MatCreateLaplacian’
>>>>> was not declared in this scope
>>>>> gmake[2]: ***
>>>>> [arch-linux2-cxx-debug/obj/src/mat/order/ftn-auto/spectralf.o] Error 1
>>>>> gmake[2]: *** Waiting for unfinished jobs....
>>>>> src/mat/order/hslmc64.c: In function ‘PetscErrorCode HSLmc64AD(const
>>>>> PetscInt*, PetscInt*, PetscInt*, PetscInt*, const PetscInt*, const
>>>>> PetscInt*, PetscScalar*, PetscInt*, PetscInt*, PetscInt*, PetscInt*,
>>>>> PetscInt*, PetscScalar*, PetscInt*, PetscScalar*, PetscInt*)’:
>>>>> src/mat/order/hslmc64.c:332:21: warning: variable ‘warn1’ set but not
>>>>> used [-Wunused-but-set-variable]
>>>>> src/mat/order/hslmc64.c:332:28: warning: variable ‘warn2’ set but not
>>>>> used [-Wunused-but-set-variable]
>>>>> src/mat/order/hslmc64.c:332:35: warning: variable ‘warn4’ set but not
>>>>> used [-Wunused-but-set-variable]
>>>>> gmake[2]: Leaving directory `/home/bkhanal/Documents/softwares/petsc'
>>>>> gmake[1]: *** [gnumake] Error 2
>>>>> gmake[1]: Leaving directory `/home/bkhanal/Documents/softwares/petsc'
>>>>> **************************ERROR*************************************
>>>>> Error during compile, check arch-linux2-cxx-debug/conf/make.log
>>>>> Send it and arch-linux2-cxx-debug/conf/configure.log to
>>>>> petsc-maint at mcs.anl.gov
>>>>> ********************************************************************
>>>>> make: *** [all] Error 1
>>>>>
>>>>>
>>>>>
>>>>>>
>>>>>> 2) then we will need to create a PetscSection that puts unknowns
>>>>>> where you want them
>>>>>>
>>>>>> 3) Set up the solver as usual
>>>>>>
>>>>>> You can do 1) and 3) before we do 2).
>>>>>>
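To give an idea of what 2) would involve, here is a conceptual, untested sketch of a PetscSection that places one unknown on every grid node inside the computational domain and none elsewhere, using the same hypothetical mask array as above (obtained with DMDAVecGetArray() on the ghosted local image vector). The point numbering, a plain lexicographic count of the locally owned nodes, is only an assumption for illustration; the chart that the alpha DMDA+PetscSection code actually uses may differ, and making DMCreateMatrix() honour such a section on a DMDA is exactly the alpha functionality mentioned above.

static PetscErrorCode SetupDomainSection(DM da, PetscScalar ***mask)
{
  PetscErrorCode ierr;
  PetscSection   s;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm, p = 0;

  PetscFunctionBeginUser;
  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  ierr = PetscSectionCreate(PetscObjectComm((PetscObject)da), &s);CHKERRQ(ierr);
  /* assumed point numbering: lexicographic count of the locally owned nodes */
  ierr = PetscSectionSetChart(s, 0, xm*ym*zm);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; ++k)
    for (j = ys; j < ys+ym; ++j)
      for (i = xs; i < xs+xm; ++i, ++p) {
        if (PetscRealPart(mask[k][j][i]) > 0.5) {      /* computational node */
          ierr = PetscSectionSetDof(s, p, 1);CHKERRQ(ierr);
        }                                              /* other nodes keep 0 dof */
      }
  ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
  ierr = DMSetDefaultSection(da, s);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}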
>>>>>>
>>>>>>>> If not, just put the identity into
>>>>>>>> the rows you do not use on the full cube. It will not hurt
>>>>>>>> scalability or convergence.
>>>>>>>>
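Concretely, a rough, untested sketch of that "identity in the unused rows" assembly on the full cube could look like the following. A and b are assumed to come from DMCreateMatrix() and DMCreateGlobalVector() on the same DMDA (in petsc-3.4, DMCreateMatrix() additionally takes a MatType argument), mask is the hypothetical ghosted local array from the earlier sketch, the grid spacing is taken as 1, and the computational domain is assumed to stay away from the outer faces of the cube so the neighbour lookups never leave the grid:

static PetscErrorCode AssembleLaplacian(DM da, PetscScalar ***mask, Mat A, Vec b)
{
  PetscErrorCode ierr;
  PetscInt       i, j, k, m, xs, ys, zs, xm, ym, zm;
  const PetscInt di[6] = {-1, 1, 0, 0, 0, 0};
  const PetscInt dj[6] = { 0, 0,-1, 1, 0, 0};
  const PetscInt dk[6] = { 0, 0, 0, 0,-1, 1};
  PetscScalar    ***barr;

  PetscFunctionBeginUser;
  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, b, &barr);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; ++k) {
    for (j = ys; j < ys+ym; ++j) {
      for (i = xs; i < xs+xm; ++i) {
        MatStencil  row, col[7];
        PetscScalar v[7];
        PetscInt    n = 0;

        row.i = i; row.j = j; row.k = k;
        if (PetscRealPart(mask[k][j][i]) < 0.5) {
          /* node outside the computational domain: identity row, zero RHS */
          v[0] = 1.0; col[0].i = i; col[0].j = j; col[0].k = k;
          ierr = MatSetValuesStencil(A, 1, &row, 1, col, v, INSERT_VALUES);CHKERRQ(ierr);
          barr[k][j][i] = 0.0;
          continue;
        }
        /* node inside: standard 7-point (minus) Laplacian, unit spacing */
        v[n] = 6.0; col[n].i = i; col[n].j = j; col[n].k = k; ++n;
        barr[k][j][i] = 0.0;                   /* source term f(i,j,k) goes here */
        for (m = 0; m < 6; ++m) {
          const PetscInt ii = i+di[m], jj = j+dj[m], kk = k+dk[m];
          if (PetscRealPart(mask[kk][jj][ii]) > 0.5) {
            v[n] = -1.0; col[n].i = ii; col[n].j = jj; col[n].k = kk; ++n;
          } else {
            /* neighbour is a Dirichlet boundary point: its known value
               would be added to barr[k][j][i] here */
          }
        }
        ierr = MatSetValuesStencil(A, 1, &row, n, col, v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = DMDAVecRestoreArray(da, b, &barr);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

An alternative with the same effect is to assemble the 7-point stencil over the whole cube first and then call MatZeroRowsStencil() (or MatZeroRows()) with diag = 1.0 on the rows you do not use, which can also fix up the corresponding entries of b for you.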
>>>>>>>
>>>>>>> In the case of Poisson with a Dirichlet condition this might be the
>>>>>>> case. But is it always true that having identity rows in the system matrix
>>>>>>> will not hurt convergence? I thought otherwise, for the following reasons:
>>>>>>> 1) Having read Jed's answer here:
>>>>>>> http://scicomp.stackexchange.com/questions/3426/why-is-pinning-a-point-to-remove-a-null-space-bad/3427#3427
>>>>>>>
>>>>>>
>>>>>> Jed is talking about a constraint on the pressure at a point. This
>>>>>> is just decoupling these unknowns from the rest
>>>>>> of the problem.
>>>>>>
>>>>>>
>>>>>>> 2) Some observations I am getting (but I am still doing more
>>>>>>> experiments to confirm) while solving my staggered-grid 3D Stokes flow with
>>>>>>> a Schur complement and using -pc_type gamg for the A00 matrix. Putting
>>>>>>> identity rows in for Dirichlet boundaries and for ghost cells seemed to
>>>>>>> affect its convergence. I'm hoping that once I know how to use PetscSection,
>>>>>>> I can get rid of the ghost-cell method for the staggered grid and get
>>>>>>> rid of the identity rows too.
>>>>>>>
>>>>>>
>>>>>> It can change the exact iteration, but it does not make the matrix
>>>>>> conditioning worse.
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>>> Anyway, please provide me with some pointers so that I can start
>>>>>>> trying PetscSection on top of a DMDA, in the beginning for the
>>>>>>> non-staggered case.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Bishesh
>>>>>>>
>>>>>>>>
>>>>>>>> Matt
>>>>>>>>
>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Bishesh
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener