[petsc-users] GAMG and linear elasticity
Mark F. Adams
mfadams at lbl.gov
Sun Sep 1 10:40:53 CDT 2013
On Sep 1, 2013, at 8:57 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
> OK I will fix it to only pass local values.
>
> What should I do when I have additional linear constraint equations which are implemented using Lagrange multipliers (I use it even for things like non-zero displacement/pressure BCs)?
>
> Also, with ex56.c I don't see any huge difference when PCSetCoordinates is used. Here's the number of iterations I get when I use it as is (note the increasing 'ne' value; only the final iteration count is shown):
>
The RHS/solution in ex56 does not have any rotational component, so the rotation modes are not that important.
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 2 -alpha 1.e-3 -ksp_monitor
> 6 KSP Residual norm 7.616482944250e-05
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 4 -alpha 1.e-3 -ksp_monitor
> 8 KSP Residual norm 2.561028520143e-03
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 8 -alpha 1.e-3 -ksp_monitor
> 12 KSP Residual norm 6.460398845075e-03
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 16 -alpha 1.e-3 -ksp_monitor
> 15 KSP Residual norm 1.888183406824e-02
>
> And here's what I get if I comment out the line "ierr = PCSetCoordinates( pc, 3, m/3, coords );"
>
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 2 -alpha 1.e-3 -ksp_monitor
> [0]PCSetData_AGG bs=3 MM=81
> 4 KSP Residual norm 8.448311817203e-04
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 4 -alpha 1.e-3 -ksp_monitor
> [0]PCSetData_AGG bs=3 MM=375
> 7 KSP Residual norm 3.281335307043e-03
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 8 -alpha 1.e-3 -ksp_monitor
> [0]PCSetData_AGG bs=3 MM=2187
> 12 KSP Residual norm 4.324990561199e-03
> stali at i5:~/petsc-3.3-p4/src/ksp/ksp/examples/tutorials$ mpirun -np 2 ./ex56 -ne 16 -alpha 1.e-3 -ksp_monitor
> [0]PCSetData_AGG bs=3 MM=14739
> 17 KSP Residual norm 7.038154621679e-03
>
> Tabrez
>
>
> On 08/31/2013 04:06 PM, Mark F. Adams wrote:
>> On Aug 31, 2013, at 12:25 PM, Tabrez Ali<stali at geology.wisc.edu> wrote:
>>
>>> Hello
>>>
>>> So I used PCSetCoordinates and now GAMG seems to work really well, in that the number of iterations is relatively constant. Here are the iteration counts on 4 cores:
>>>
>>> DOF      ASM  GAMG
>>> 2187      15    22
>>> 14739     26    22
>>> 107811    51    29
>>>
>>> So in PCSetCoordinates, should the 'coords' array include values for the ghost nodes as well, or only those values that correspond to the locally owned solution vector?
>> Local only.
>>
>>> In the experiment above I included values of the ghost nodes as well (just had to add a line in my existing code) and it seems to have worked fine.
>>>
>> You tacked them onto the end of the array, so no harm done; we just did not read them.
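As a minimal sketch of the local-only convention (assuming 3 dofs per vertex, that ndof_local is the number of locally owned solution entries, and that get_owned_vertex_coords() is a hypothetical application routine that fills interlaced x,y,z for the owned vertices only, no ghosts):

    #include <petscksp.h>

    /* Sketch only: pass coordinates of the locally owned vertices to GAMG. */
    PetscErrorCode SetGAMGCoords(KSP ksp, PetscInt ndof_local)
    {
      PC             pc;
      PetscInt       nloc = ndof_local / 3;   /* locally owned vertices */
      PetscReal      *coords;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PetscMalloc(3*nloc*sizeof(PetscReal), &coords);CHKERRQ(ierr);
      /* hypothetical application routine; fills x0,y0,z0, x1,y1,z1, ... */
      ierr = get_owned_vertex_coords(nloc, coords);CHKERRQ(ierr);
      ierr = PCSetCoordinates(pc, 3, nloc, coords);CHKERRQ(ierr);
      ierr = PetscFree(coords);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }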
>>
>> And you might want to use MatNullSpaceCreateRigidBody to create these vectors from the coordinates. This would add one extra step, but 1) it is the preferred way, and 2) it sounds like you want to do something like Stokes, and you could modify the vectors from MatNullSpaceCreateRigidBody to do an all-MG solver (and dump this fieldsplit crap :). SOR smoothers on inode matrices are actually vertex-blocked smoothers, so they are stable even though they have a zero on the diagonal (just order pressure last).
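A minimal sketch of that preferred path, assuming A is the assembled elasticity operator and coords is a Vec holding the interlaced x,y,z coordinates of the locally owned vertices (same vertex layout as the solution vector):

    MatNullSpace nullsp;
    ierr = MatNullSpaceCreateRigidBody(coords, &nullsp);CHKERRQ(ierr); /* 6 rigid body modes in 3D */
    ierr = MatSetNearNullSpace(A, nullsp);CHKERRQ(ierr);               /* GAMG picks up the near null space from the matrix */
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);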
>>
>> I think Jed mentioned this to you, but specifically you can take the vectors that come out of MatNullSpaceCreateRigidBody and think of them as a tall skinny matrix: 3*n x 6. For the 3x6 block of each of the n vertices, call it Q, create a 4x7 matrix:
>>
>> Q 0
>> 0 1.0
>>
>> and give that to GAMG (i.e., 7 vectors of size 4*n). It would be very interesting to see how this works compared to fieldsplit.
>>
>> Oh, and pressure has to be a vertex variable.
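A sketch of that construction, under assumptions not spelled out above: unknowns ordered (u,v,w,p) per vertex, coords holding the interlaced coordinates of the locally owned vertices, everything on PETSC_COMM_WORLD, and the matrix partitioned by vertex so its local row size is 4*n. The 6 rigid body vectors are padded with zero pressure entries and a normalized constant-pressure vector is appended as the 7th:

    PetscErrorCode BuildAugmentedNearNullSpace(Mat A, Vec coords)
    {
      MatNullSpace      rb, nearnull;
      const Vec        *rbvecs;             /* 6 rigid body vectors of size 3*n */
      Vec               bigvecs[7];
      const PetscScalar *src;
      PetscScalar       *dst;
      PetscInt          i, v, n, nrb;
      PetscBool         has_cnst;
      PetscErrorCode    ierr;

      PetscFunctionBegin;
      ierr = MatNullSpaceCreateRigidBody(coords, &rb);CHKERRQ(ierr);
      ierr = MatNullSpaceGetVecs(rb, &has_cnst, &nrb, &rbvecs);CHKERRQ(ierr);
      ierr = VecGetLocalSize(coords, &n);CHKERRQ(ierr);
      n    = n / 3;                         /* number of locally owned vertices */

      for (i = 0; i < 7; i++) {             /* 7 vectors of local size 4*n */
        ierr = VecCreateMPI(PETSC_COMM_WORLD, 4*n, PETSC_DETERMINE, &bigvecs[i]);CHKERRQ(ierr);
        ierr = VecSet(bigvecs[i], 0.0);CHKERRQ(ierr);
      }
      /* copy each rigid body vector into the displacement slots; pressure stays 0 */
      for (i = 0; i < nrb; i++) {
        ierr = VecGetArrayRead(rbvecs[i], &src);CHKERRQ(ierr);
        ierr = VecGetArray(bigvecs[i], &dst);CHKERRQ(ierr);
        for (v = 0; v < n; v++) {
          dst[4*v+0] = src[3*v+0];
          dst[4*v+1] = src[3*v+1];
          dst[4*v+2] = src[3*v+2];
        }
        ierr = VecRestoreArray(bigvecs[i], &dst);CHKERRQ(ierr);
        ierr = VecRestoreArrayRead(rbvecs[i], &src);CHKERRQ(ierr);
      }
      /* 7th vector: constant on the pressure dof of every vertex, then normalized */
      ierr = VecGetArray(bigvecs[6], &dst);CHKERRQ(ierr);
      for (v = 0; v < n; v++) dst[4*v+3] = 1.0;
      ierr = VecRestoreArray(bigvecs[6], &dst);CHKERRQ(ierr);
      ierr = VecNormalize(bigvecs[6], NULL);CHKERRQ(ierr);

      ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, 7, bigvecs, &nearnull);CHKERRQ(ierr);
      ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&rb);CHKERRQ(ierr);
      for (i = 0; i < 7; i++) { ierr = VecDestroy(&bigvecs[i]);CHKERRQ(ierr); }
      PetscFunctionReturn(0);
    }

Presumably one would also set the matrix block size to 4 (e.g. with MatSetBlockSize) so that GAMG and the inode/SOR smoother see the vertex blocking described above.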
>>
>>> Thanks in advance
>>>
>>> Tabrez
>>>
>>> On 08/27/2013 03:15 PM, Jed Brown wrote:
>>>> Tabrez Ali<stali at geology.wisc.edu> writes:
>>>>
>>>>> Hello
>>>>>
>>>>> What is the proper way to use GAMG on a vanilla 3D linear elasticity
>>>>> problem. Should I use
>>>>>
>>>>> -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1
>>>> Yeah, and only the first of these is needed because the others are
>>>> default with -pc_type gamg.
>>>>
>>>>> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg
>>>>> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1
>>>>>
>>>>> Do these options even make sense? With the second set of options the
>>>>> percentage increase in the number of iterations with increasing problem
>>>>> size is lower than with the first, but still not optimal.
>>>> And it's probably more expensive because it has to do inner solves.
>>>> Also, if you have less compressible regions, it will get much worse.
>>>>
>>>>> Also, ksp/ksp/examples/ex56 performs much better in that the number of
>>>>> iterations remains more or less constant, unlike what I see with my own
>>>>> problem. What am I doing wrong?
>>>> You probably forgot to set the near null space. You can use
>>>> MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the more
>>>> hacky (IMO) PCSetCoordinates. It's important to have translational
>>>> *and* rotational modes in the near null space that GAMG uses to build a
>>>> coarse space.
>>>
>>> --
>>> No one trusts a model except the one who wrote it; everyone trusts an observation except the one who made it. - Harlow Shapley
>>>
>>>
>