[petsc-users] Re: user experience with PCNN

Jakub Sistek sistek at math.cas.cz
Mon Oct 3 14:40:41 CDT 2011


Dear Jed,

thank you for your quick response and answers. And I am very pleased by 
your interest in BDDC :-)

I would also be very happy if we could build a native implementation of 
BDDC in PETSc together, and I have thought about something like that 
several times. However, I can see some issues with this, which might be 
caused only by my limited knowledge of PETSc. I would love to hear your 
comments on them.

One thing I particularly enjoy about PETSc is the quick 
interchangeability of preconditioners and Krylov methods within the KSP 
object. But as I see it, this is possible thanks to the strictly 
algebraic nature of the approach, where only a matrix object is passed.
On the other hand, all of the FETI-DP and BDDC implementations I have 
heard of are tied to FEM computations and make the mesh somewhat 
accessible to the solver. Although I do not like this, my 
third-generation implementation of the BDDC method also still needs some 
limited geometric information. It is not really needed for the 
construction of the coarse basis functions (this is algebraic in BDDC), 
but rather, indirectly, for the selection of coarse degrees of freedom. 
I am not aware of any existing approach to the selection of coarse DOFs 
that would not require some geometric information for a robust selection 
on unstructured 3D meshes. I could imagine that the required information 
could be limited to the positions of the unknowns and some information 
about the problem being solved (the nullspace size); the topology of the 
mesh is not really necessary.
Because of this difficulty, I do not see a simple way to write something 
like a PCBDDC preconditioner that would simply interchange with PCASM 
and others. The situation would be simpler for BDDC if the 
preconditioner could also use some kind of mesh description.
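To make the interchangeability point concrete: with an application built on KSP (say a hypothetical `./myapp` that assembles its operator and calls KSPSetFromOptions()), switching the method is purely a runtime decision — no recompilation, no code change:

```shell
# same executable, same assembled operator; only the runtime options change
mpiexec -n 8 ./myapp -ksp_type cg    -pc_type asm      # one-level additive Schwarz
mpiexec -n 8 ./myapp -ksp_type gmres -pc_type bjacobi  # block Jacobi
mpiexec -n 8 ./myapp -ksp_type cg    -pc_type nn       # Neumann-Neumann (requires MATIS)
```

This is exactly what a BDDC preconditioner needing mesh information would break.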

The other issue I see as somewhat conflicting with the KSP approach of 
PETSc is the fact that BDDC implementations introduce some coupling 
between the preconditioner and the Krylov method, which is in fact run 
only on the Schur complement problem at the interface between 
subdomains. Multiplication by the system matrix in the Krylov method is 
performed by Dirichlet solves on each subdomain, which corresponds to 
passing a special matrix-vector multiplication routine to the Krylov 
method. At least, this is the approach I follow in my latest 
implementation of BDDC, the BDDCML code, where essentially the 
preconditioner provides the A*x function to the Krylov method.
I have seen this circumvented in PCNN by expanding the vectors to the 
original size after each application of the preconditioner, but in my 
opinion this approach then loses some of the efficiency of running the 
Krylov method on the Schur complement problem instead of the original 
problem, which usually has a great effect on convergence by itself.

Regarding problem types, I have little experience with using BDDC beyond 
Poisson problems and elasticity. Recently, I have done some tests with 
Stokes problems and incompressible Navier-Stokes problems, using "brute 
force" rather than any delicacy you may have in mind. The initial 
experience with Stokes problems using Taylor-Hood elements is quite 
good; things get worse for Navier-Stokes, where the convergence, with 
the current simple coarse problem, deteriorates quickly with increasing 
Reynolds number. However, all these things should be tested more 
thoroughly and, as you probably know, they are a rather recent research 
topic on which no clear conclusions have really been reached.

I am looking forward to your comments. It would certainly be great if we 
could make BDDC work within PETSc. Let us see if we can overcome these 
issues...

Jakub



On 10/03/2011 05:44 PM, Jed Brown wrote:
> On Mon, Oct 3, 2011 at 10:13, Jakub Sistek <sistek at math.cas.cz 
> <mailto:sistek at math.cas.cz>> wrote:
>
>     Dear PETSc developers and users,
>
>     we are using PETSc to solve systems arising from mixed-hybrid FEM
>     applied to Darcy flow. First, the saddle point system is reduced
>     to the Schur complement problem for Lagrange multipliers on
>     element interfaces, which is then symmetric positive definite.
>     Currently, we are using the PCASM preconditioner to solve it. We
>     have very positive experience (also from other projects) with
>     PCASM, but we have observed some worsening of convergence and
>     scalability when going to larger numbers of processors (up to 64)
>     here. As far as we understand, the increasing number of iterations
>     may be caused by the lack of coarse correction in the
>     implementation of the preconditioner. On the other hand, PCNN
>     should contain such a coarse solve. I have modified our FEM code
>     to support MATIS matrices besides MPIAIJ, but so far have a mixed
>     experience with PCNN. It seems to work on 2 CPUs, but complains
>     about singular local problems (solved by MUMPS) on more. After
>     some time spent debugging (though there are probably still many
>     bugs left in my code) and playing unsuccessfully with some of the
>     related options (-pc_is_damp_fixed,
>     -pc_is_set_damping_factor_floating, etc.), I have decided to ask a
>     couple of questions before continuing to investigate why PCNN does
>     not work for me in the general case:
>
>     1) Am I right that PCNN is the only domain decomposition method
>     exploiting coarse correction readily available in PETSc?
>
>
> Multilevel Schwarz methods can be used through PCMG. A two-level 
> example, based on the so-called wirebasket coarse spaces, is PCEXOTIC. 
> This code was basically an experiment that only works for scalar 
> problems on structured grids (using DMDA). It could certainly be 
> generalized.
>
> The new PCGAMG is not a DD method per se, but it is developing a lot 
> of hooks with many ideas that overlap with modern DD methods.
>
>     2) As PCNN seems much less documented (I have found no example or
>     so) than other preconditioners, I would simply like to know if
>     someone else uses it and has positive experience with this
>     implementation?
>
>
> Unfortunately, PCNN was more of a research project than robust or 
> widely used code. The coarse space used by the current code is only 
> suitable for scalar problems. The method uses coarse spaces that are 
> more awkward to generalize than the more recently developed balancing 
> and dual methods like BDDC/FETI-DP.
>
>     3) What may be proper options for stabilizing solutions of the
>     local problems?
>     4) Are there limitations to the method with respect to nullspace
>     type of subdomain problems, i.e. equation?
>     5) Do these answers depend on version of PETSc? (I have played
>     with 3.0, would things be different with 3.2 ?)
>
>
> PCNN specifically is no different in 3.2, but there are many other 
> improvements, so I recommend upgrading. One notable improvement is 
> that now you can use MatGetLocalSubMatrix() as part of assembly, such 
> that exactly the same code can assemble into MATIS, MATAIJ, MATNEST, etc.
>
>
>     In the long run, I would like to connect the FEM code to an own
>     solver based on BDDC domain decomposition, for which the MATIS
>     matrix seems as a natural format and connection should be
>     straightforward. However, I would like to make it work with PCNN
>     as well.
>
>
> I have been meaning to add BDDC to PETSc for a long time, but it 
> hasn't worked its way up to top priority yet. I have read some of your 
> papers on the subject and would be happy to work with you on a native 
> PETSc implementation. (My interest is largely on use with mixed 
> discretizations for incompressible problems, so it's somewhat more 
> delicate than non-mixed elasticity.)
>
> I would not worry about PCNN.


-- 
Jakub Sistek, Ph.D.
postdoctoral researcher
Institute of Mathematics of the AS CR
http://www.math.cas.cz/~sistek/
tel: (+420) 222 090 710


