Multilevel solver

Amit.Itagi at seagate.com
Wed Apr 23 13:32:11 CDT 2008


Barry,

Is the installation of petsc-dev different from that of the 2.3.3
release? I ran configure, but the folder tree seems to be different,
so make is giving problems.

Amit




                                                                           
From: Barry Smith <bsmith at mcs.anl.gov>
Sent by: owner-petsc-users at mcs.anl.gov
To: petsc-users at mcs.anl.gov
Subject: Re: Multilevel solver
Date: 04/22/2008 10:08 PM
Please respond to: petsc-users at mcs.anl.gov




  Amit,

     Using a PCSHELL should be fine (it can be used with GMRES);
my guess is there is a memory corruption error somewhere that is
causing the crash. This could be tracked down with valgrind (www.valgrind.org).
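
(For an MPI run, each process can be launched under valgrind; a minimal
sketch, with a placeholder executable name:

    mpiexec -n 2 valgrind --tool=memcheck ./myprog <usual PETSc runtime options>

and then check the memcheck output of each rank for invalid reads/writes.)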

    Another way you could implement this is with some very recent
additions I made to PCFIELDSPLIT that are in petsc-dev
(http://www-unix.mcs.anl.gov/petsc/petsc-as/developers/index.html).
With this you would choose
  PCSetType(pc,PCFIELDSPLIT);
  PCFieldSplitSetIS(pc,is1);
  PCFieldSplitSetIS(pc,is2);
  PCFieldSplitSetType(pc,PC_COMPOSITE_SYMMETRIC_MULTIPLICATIVE);
To use LU on A11, use the command line options
  -fieldsplit_0_pc_type lu -fieldsplit_0_ksp_type preonly
and SOR on A22
  -fieldsplit_1_pc_type sor -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_sor_lits <lits>
where <lits> is the number of SOR iterations you want to use on the A22 block.
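
(Put on one command line, with a placeholder executable name and an
arbitrary choice of 3 local SOR sweeps, a run might look like:

    mpiexec -n 4 ./myprog -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
            -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type sor \
            -fieldsplit_1_pc_sor_lits 3

plus whatever options select the parallel LU package in your PETSc version.)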

is1 is the IS that contains the indices of all the vector entries in
the 1 block, while is2 contains all the indices in the vector for the
2 block. You can use ISCreateGeneral() to create these.
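
(A minimal sketch of these calls, assuming ksp is already set up and that
idx1/idx2 hold this process's locally owned global indices of the two blocks;
the two-argument PCFieldSplitSetIS() and the ISCreateGeneral() signature shown
match petsc-dev of this period, and both have since gained extra arguments:

    PC pc;
    IS is1, is2;

    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);

    /* index sets holding the global indices of the 1-block and
       2-block entries owned by this process */
    ISCreateGeneral(PETSC_COMM_WORLD, n1local, idx1, &is1);
    ISCreateGeneral(PETSC_COMM_WORLD, n2local, idx2, &is2);

    PCFieldSplitSetIS(pc, is1);
    PCFieldSplitSetIS(pc, is2);
    PCFieldSplitSetType(pc, PC_COMPOSITE_SYMMETRIC_MULTIPLICATIVE);

    KSPSetFromOptions(ksp);  /* picks up the -fieldsplit_* options above */
)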

   Probably it is easiest just to try this out.

   Barry


On Apr 22, 2008, at 8:45 PM, Amit.Itagi at seagate.com wrote:

>
> Hi,
>
> I am trying to implement a multilevel method for an EM problem. The
> reference is : "Comparison of hierarchical basis functions for
> efficient
> multilevel solvers", P. Ingelstrom, V. Hill and R. Dyczij-Edlinger,
> IET
> Sci. Meas. Technol. 2007, 1(1), pp 48-52.
>
> Here is the summary:
>
> The matrix equation Ax=b is solved using GMRES with a multilevel
> pre-conditioner. A has a block structure.
>
> A11  A12   *   x1   =   b1
> A21  A22        x2        b2
>
> A11 is mxm and A22 is nxn, where m is not equal to n.
>
> Step 1:   Solve   A11 * e1 = b1               (parallel LU using SuperLU or MUMPS)
>
> Step 2:   Solve   A22 * e2 = b2 - A21 * e1    (might either use a SOR solver or a parallel LU)
>
> Step 3:   Solve   A11 * e1 = b1 - A12 * e2    (parallel LU)
>
> This gives the approximate solution to
>
> A11  A12   *   e1   =   b1
> A21  A22        e2        b2
>
> and is used as the pre-conditioner for the GMRES.
>
>
> Which PETSc method can implement this pre-conditioner? I tried a
> PCSHELL type PC. With Hong's help, I also got the parallel LU to work
> with SuperLU/MUMPS. My program runs successfully on multiple
> processes on a
> single machine. But when I submit the program over multiple
> machines, I get
> a crash in the PCApply routine after several GMRES iterations. I
> think this
> has to do with using PCSHELL with GMRES (which is not a good idea). Is
> there a different way to implement this? Does this resemble the usage
> pattern of one of the AMG preconditioners?
>
>
> Thanks
>
> Rgds,
> Amit
>
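
(As a rough illustration of the three-step procedure quoted above, a PCShell
apply routine might look something like the sketch below. All names here --
ShellCtx, kspA11, kspA22, the scatters, and the work vectors -- are hypothetical
members of a user-built context, error checking is omitted, and the (PC,Vec,Vec)
apply signature follows the newer PETSc convention; older releases pass a void*
context as the first argument instead.

    typedef struct {
      KSP        kspA11, kspA22;          /* LU solve on A11, SOR (or LU) on A22 */
      Mat        A12, A21;                /* off-diagonal blocks */
      Vec        b1, b2, e1, e2, t1, t2;  /* block-sized work vectors */
      VecScatter toBlock1, toBlock2;      /* global vector <-> block scatters */
    } ShellCtx;

    PetscErrorCode ApplyMultilevelPC(PC pc, Vec b, Vec x)
    {
      ShellCtx *ctx;

      PCShellGetContext(pc, (void **)&ctx);

      /* split the incoming vector b into its 1-block and 2-block pieces */
      VecScatterBegin(ctx->toBlock1, b, ctx->b1, INSERT_VALUES, SCATTER_FORWARD);
      VecScatterEnd(ctx->toBlock1, b, ctx->b1, INSERT_VALUES, SCATTER_FORWARD);
      VecScatterBegin(ctx->toBlock2, b, ctx->b2, INSERT_VALUES, SCATTER_FORWARD);
      VecScatterEnd(ctx->toBlock2, b, ctx->b2, INSERT_VALUES, SCATTER_FORWARD);

      /* Step 1: A11 e1 = b1 */
      KSPSolve(ctx->kspA11, ctx->b1, ctx->e1);

      /* Step 2: A22 e2 = b2 - A21 e1 */
      MatMult(ctx->A21, ctx->e1, ctx->t2);
      VecAYPX(ctx->t2, -1.0, ctx->b2);          /* t2 = b2 - A21 e1 */
      KSPSolve(ctx->kspA22, ctx->t2, ctx->e2);

      /* Step 3: A11 e1 = b1 - A12 e2 */
      MatMult(ctx->A12, ctx->e2, ctx->t1);
      VecAYPX(ctx->t1, -1.0, ctx->b1);          /* t1 = b1 - A12 e2 */
      KSPSolve(ctx->kspA11, ctx->t1, ctx->e1);

      /* assemble the preconditioned result x from e1 and e2 */
      VecScatterBegin(ctx->toBlock1, ctx->e1, x, INSERT_VALUES, SCATTER_REVERSE);
      VecScatterEnd(ctx->toBlock1, ctx->e1, x, INSERT_VALUES, SCATTER_REVERSE);
      VecScatterBegin(ctx->toBlock2, ctx->e2, x, INSERT_VALUES, SCATTER_REVERSE);
      VecScatterEnd(ctx->toBlock2, ctx->e2, x, INSERT_VALUES, SCATTER_REVERSE);
      return 0;
    }
)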





