[petsc-users] Could DMComposite handle two vectors with different grid sizes.

(Rebecca) Xuefei YUAN xy2102 at columbia.edu
Tue Jul 27 22:55:06 CDT 2010


Dear Barry,

Yes, you are right. I looked through the examples and found this one:

http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex31.c.html

However, I was not able to run it with the option '-snes_mf';
is there a known problem with that?
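
For reference, a matrix-free run of that tutorial would typically be
launched along these lines (the executable name and the added monitor
option are assumptions here, not taken from the failing run):

    ./ex31 -snes_mf -snes_monitor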

Thanks a lot!

Rebecca
Quoting Barry Smith <bsmith at mcs.anl.gov>:

>
>    DMComposite doesn't require the different DAs to have the same
> dimensions; they can have different sizes, numbers of grid points, etc.
>
>    Barry
>
> On Jul 27, 2010, at 10:11 PM, (Rebecca) Xuefei YUAN wrote:
>
>> Dear Barry,
>>
>> I went over mp1.c as the multiphysics example. However, my concern is
>> not addressed in that example: will it be fine if the grid sizes of the
>> different DAs are different?
>>
>> The different grid sizes of da_A and da_B come from the fact that they
>> share the same physical domain, but their computational domains differ
>> slightly: problem A has a Neumann boundary condition (so its
>> computational domain extends half a grid spacing beyond the physical
>> domain, and each unknown sits at a cell center), while problem B has a
>> Dirichlet boundary condition (so its computational domain coincides
>> with the physical domain, and each unknown sits at a grid node). Thus
>> problem A ALWAYS needs one more grid point in each direction than
>> problem B at every level.
>>
>> If the coarsest grid has a difference of 1 in each direction
>> (mA - mB = 1), then under the standard multigrid refinement the next
>> finer level has a difference of (2*mA-1) - (2*mB-1) = 2*(mA-mB) = 2 in
>> each direction, and the level after that has a difference of
>> (2*(2*mA-1)-1) - (2*(2*mB-1)-1) = 4*(mA-mB) = 4. Because the
>> computational domain of problem A extends only half a spacing beyond
>> the physical domain, the number of abandoned grid points (abandonNumber)
>> is 1-1 = 0 for nl = 1, 2-1 = 1 for nl = 2, and 4-1 = 3 for nl = 3.
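
For concreteness, a small stand-alone sketch of that refinement
arithmetic, assuming coarsest sizes mB = 5 and mA = mB + 1 = 6 (matching
the 5x5 and 6x6 grids in the code quoted further down) and the usual
m -> 2m-1 refinement:

    #include <stdio.h>

    int main(void)
    {
      int mB = 5;        /* coarsest grid of problem B (node-centered) */
      int mA = mB + 1;   /* coarsest grid of problem A (cell-centered) */
      int nl;

      for (nl = 1; nl <= 3; nl++) {
        printf("nl = %d: mA = %2d, mB = %2d, difference = %d, abandoned = %d\n",
               nl, mA, mB, mA - mB, (mA - mB) - 1);
        mA = 2*mA - 1;   /* standard multigrid refinement m -> 2m-1 */
        mB = 2*mB - 1;
      }
      return 0;
    }

It prints differences of 1, 2, 4 and abandoned counts of 0, 1, 3 for
nl = 1, 2, 3, matching the numbers above.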
>>
>> To keep those extra points in the residual function and the Jacobian,
>> I apply the Neumann boundary condition at those points (mirroring them
>> with the interior points).
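
A minimal sketch of that mirroring in residual form, assuming the extra
cell-centered column of problem A sits at i = mx-1, just outside the
physical boundary (the array and index names are illustrative, not from
the original code):

    /* extra point outside the physical domain: enforce a homogeneous
       Neumann condition by mirroring the neighbouring interior value,
       i.e. the residual drives x[j][mx-1] to equal x[j][mx-2]          */
    if (i == mx-1) {
      f[j][i] = x[j][i] - x[j][i-1];
    }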
>>
>> Will this procedure be fine for such a multiphysics problem?
>>
>> Thanks very much!
>>
>> Rebecca
>>
>>
>>
>>
>> Quoting Barry Smith <bsmith at mcs.anl.gov>:
>>
>>>
>>>
>>>   What you are doing below looks OK. This builds the DM and DMMG.
>>> See tutorials/multiphysics/mp1.c for an example that uses SNES to
>>> solve a nonlinear equation on the grids.
>>>
>>>   Barry
>>>
>>>
>>>
>>> On Jul 27, 2010, at 7:36 PM, (Rebecca) Xuefei YUAN wrote:
>>>
>>>> Dear all,
>>>>
>>>> If I have two different vectors (with different dof, stencil widths,
>>>> and grid sizes), could I use DMComposite to handle this?
>>>>
>>>> The following is the piece of code that assembles this DMComposite into the DMMG.
>>>>
>>>>   /* coarsest-level sizes: the "field" grid, and the "grid" grid that
>>>>      is one point larger in each direction                            */
>>>>   tempparameters.mxfield = 5;
>>>>   tempparameters.myfield = 5;
>>>>
>>>>   tempparameters.mxgrid  = tempparameters.mxfield + 1;
>>>>   tempparameters.mygrid  = tempparameters.myfield + 1;
>>>>
>>>>   ierr = DMMGCreate(comm, tempparameters.numberOfLevels, &appCtx,
>>>>                     &dmmg);CHKERRQ(ierr);
>>>>
>>>>   ierr = DMCompositeCreate(comm, &packer);CHKERRQ(ierr);
>>>>
>>>>   /* da_fff: 4 dof (phi, vz, psi, bz), stencil width 3 */
>>>>   ierr = DACreate2d(comm, DA_NONPERIODIC, DA_STENCIL_BOX,
>>>>                     tempparameters.mxfield, tempparameters.myfield,
>>>>                     PETSC_DECIDE, PETSC_DECIDE, 4, 3,
>>>>                     PETSC_NULL, PETSC_NULL, &da_fff);CHKERRQ(ierr);
>>>>   ierr = DASetFieldName(da_fff, 0, "phi");CHKERRQ(ierr);
>>>>   ierr = DASetFieldName(da_fff, 1, "vz");CHKERRQ(ierr);
>>>>   ierr = DASetFieldName(da_fff, 2, "psi");CHKERRQ(ierr);
>>>>   ierr = DASetFieldName(da_fff, 3, "bz");CHKERRQ(ierr);
>>>>   /* da_ggg: 1 dof on the grid with one more point in each direction */
>>>>   ierr = DACreate2d(comm, DA_NONPERIODIC, DA_STENCIL_BOX,
>>>>                     tempparameters.mxgrid, tempparameters.mygrid,
>>>>                     PETSC_DECIDE, PETSC_DECIDE, 1,
>>>>                     tempparameters.gridStencilWidth,
>>>>                     PETSC_NULL, PETSC_NULL, &da_ggg);CHKERRQ(ierr);
>>>>   ierr = DASetFieldName(da_ggg, 0, "disp");CHKERRQ(ierr);
>>>>
>>>>   ierr = DMCompositeAddDM(packer, (DM)da_fff);CHKERRQ(ierr);
>>>>   ierr = DMCompositeAddDM(packer, (DM)da_ggg);CHKERRQ(ierr);
>>>>   ierr = DMCompositeAddArray(packer, 0, 1);CHKERRQ(ierr);
>>>>
>>>>   ierr = DMMGSetDM(dmmg, (DM)packer);CHKERRQ(ierr);
>>>>
>>>>   /* the packer keeps its own reference to each DA, so these
>>>>      handles can be released here                             */
>>>>   ierr = DADestroy(da_fff);CHKERRQ(ierr);
>>>>   ierr = DADestroy(da_ggg);CHKERRQ(ierr);
>>>>
>>>> .........................................................
>>>>
>>>>   ierr = DMCompositeDestroy(packer);CHKERRQ(ierr);
>>>>   ierr = DMMGDestroy(dmmg);CHKERRQ(ierr);
>>>>
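
As a rough sketch of the piece that mp1.c illustrates (this is not the
poster's code): once the packer is built as above, a residual routine
can pull the two differently sized pieces back out of the composite
vectors. For brevity the sketch assumes a packer holding only the two
DAs, without the extra redundant array added above, and the context
struct and names are made up for illustration:

    #include "petscda.h"
    #include "petscsnes.h"

    typedef struct { PetscScalar phi, vz, psi, bz; } Field;  /* 4 dof on da_fff */

    typedef struct {
      DMComposite packer;
      DA          da_fff, da_ggg;
    } AppCtx;

    PetscErrorCode FormFunction(SNES snes, Vec X, Vec F, void *ptr)
    {
      AppCtx         *user = (AppCtx*)ptr;
      Vec             Xf, Xg, Ff, Fg;
      Field         **xf, **ff;
      PetscScalar   **xg, **fg;
      PetscErrorCode  ierr;

      PetscFunctionBegin;
      /* ghosted local copies of the two pieces of the solution X */
      ierr = DMCompositeGetLocalVectors(user->packer, &Xf, &Xg);CHKERRQ(ierr);
      ierr = DMCompositeScatter(user->packer, X, Xf, Xg);CHKERRQ(ierr);
      /* direct access to the two pieces of the residual F */
      ierr = DMCompositeGetAccess(user->packer, F, &Ff, &Fg);CHKERRQ(ierr);

      ierr = DAVecGetArray(user->da_fff, Xf, &xf);CHKERRQ(ierr);
      ierr = DAVecGetArray(user->da_ggg, Xg, &xg);CHKERRQ(ierr);
      ierr = DAVecGetArray(user->da_fff, Ff, &ff);CHKERRQ(ierr);
      ierr = DAVecGetArray(user->da_ggg, Fg, &fg);CHKERRQ(ierr);

      /* ... fill ff and fg here, looping over each DA's own DAGetCorners()
             range; the two grids may have different sizes ...              */

      ierr = DAVecRestoreArray(user->da_fff, Xf, &xf);CHKERRQ(ierr);
      ierr = DAVecRestoreArray(user->da_ggg, Xg, &xg);CHKERRQ(ierr);
      ierr = DAVecRestoreArray(user->da_fff, Ff, &ff);CHKERRQ(ierr);
      ierr = DAVecRestoreArray(user->da_ggg, Fg, &fg);CHKERRQ(ierr);
      ierr = DMCompositeRestoreAccess(user->packer, F, &Ff, &Fg);CHKERRQ(ierr);
      ierr = DMCompositeRestoreLocalVectors(user->packer, &Xf, &Xg);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

Because each DA is accessed with its own array and its own corner
ranges, nothing in this pattern requires the two grids to have the same
size, stencil width, or number of degrees of freedom.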
>>>>
>>>> I understand that for numberOfLevels (nl) > 1 the grid size
>>>> difference is no longer 1, but 3 (nl = 2), 5 (nl = 3); could I use
>>>> DMComposite to handle such a case successfully?
>>>>
>>>> I was not able to find an example for such a case; is there one?
>>>>
>>>> Thanks a lot!
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>



-- 
(Rebecca) Xuefei YUAN
Department of Applied Physics and Applied Mathematics
Columbia University
Tel:917-399-8032
www.columbia.edu/~xy2102


