[petsc-users] block matrix in serial

Matthew Knepley knepley at gmail.com
Tue May 17 09:03:16 CDT 2016


On Sun, Apr 17, 2016 at 11:31 PM, Adrian Croucher <a.croucher at auckland.ac.nz> wrote:

> hi
>
> Matt, did you (or anyone else) ever get a chance to look into these two bugs?
>
> AFAICT the blocksize discovery code hasn't been changed since then.
>
> I'm currently running a problem where the linear solver is struggling (running
> out of iterations), and I'm wondering if it might perform better if the matrix
> block size were set correctly. It may have nothing to do with it, of course,
> but it would at least be useful to eliminate that possibility.
>
>
I have finally pushed what I think are the right fixes into master. Can you
take a look and let me know
if it is fixed for you?

  Thanks,

     Matt


> Cheers, Adrian
>
> On 16/09/15 12:40, Matthew Knepley wrote:
> > On Tue, Sep 15, 2015 at 9:05 PM, Adrian Croucher
> > <a.croucher at auckland.ac.nz> wrote:
> >
> > > hi
> > >
> > > I have a test code (attached) that sets up a finite volume mesh
> > > using DMPlex, with 2 degrees of freedom per cell.
> > >
> > > I then create a matrix using DMCreateMatrix(), having used
> > > DMSetMatType() to set the matrix type to MATBAIJ or MATMPIBAIJ, to
> > > take advantage of the block structure.
> > >
> > > This works OK and gives me the expected matrix structure when I
> > > run on > 1 processor, but neither MATBAIJ nor MATMPIBAIJ works if I
> > > run it in serial:
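For reference, a minimal sketch of the setup Adrian describes, assuming a
DMPlex dm that already has a default section attached (the function name is
illustrative, not taken from the attached test code):

  #include <petscdmplex.h>

  /* Create a blocked Jacobian from a DMPlex. MATBAIJ resolves to
     MATSEQBAIJ or MATMPIBAIJ depending on the communicator size. */
  static PetscErrorCode CreateBlockMatrix(DM dm, Mat *A)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMSetMatType(dm, MATBAIJ);CHKERRQ(ierr);
    /* The block size should be discovered from the section. */
    ierr = DMCreateMatrix(dm, A);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }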
> >
> > Plex should automatically discover the block size from the Section.
> > If not, it uses block size 1. I have to look at the example to see
> > why the discovery is not working. Do you have any constraints?
>
> > It looks like block discovery in parallel effectively always
> > determines a block size of 1. Running with -mat_view ::ascii_info gives:
> >
> >   Mat Object: 2 MPI processes
> >     type: mpibaij
> >     rows=20, cols=20
> >     total: nonzeros=112, allocated nonzeros=112
> >     total number of mallocs used during MatSetValues calls =0
> >       block size is 1
> >                     ^^^
> >
> > The block size discovery does this:
> >
> >   for (p = pStart; p < pEnd; ++p) {
> >     ierr = PetscSectionGetDof(sectionGlobal, p, &dof);CHKERRQ(ierr);
> >     ierr = PetscSectionGetConstraintDof(sectionGlobal, p, &cdof);CHKERRQ(ierr);
> >     if (dof-cdof) {
> >       if (bs < 0) {
> >         bs = dof-cdof;
> >       } else if (bs != dof-cdof) {
> >         /* Layout does not admit a pointwise block size */
> >         bs = 1;
> >         break;
> >       }
> >     }
> >   }
> >
> > In parallel, some of the dofs are remote (encoded as negative). So we
> > run through seeing (dof - cdof) == 2 until we hit a "remote" point, at
> > which point we see (dof - cdof) != 2, break out, and set bs = 1.
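For what it's worth, a sketch of the kind of guard that would fix the
parallel case, assuming remote points carry negative dof in the global
section (illustrative only, not necessarily the exact change in master):

  for (p = pStart; p < pEnd; ++p) {
    ierr = PetscSectionGetDof(sectionGlobal, p, &dof);CHKERRQ(ierr);
    ierr = PetscSectionGetConstraintDof(sectionGlobal, p, &cdof);CHKERRQ(ierr);
    if (dof < 0) continue; /* skip remote points so they cannot poison bs */
    if (dof-cdof) {
      if (bs < 0) {
        bs = dof-cdof;
      } else if (bs != dof-cdof) {
        bs = 1; /* layout does not admit a pointwise block size */
        break;
      }
    }
  }

Each rank then only sees its own candidate bs, so a global reduction
(e.g. an allreduce taking the minimum) would still be needed to keep the
block size consistent across processes.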
> >
> > In serial, we correctly determine bs == 2. But then
> > DMGetLocalToGlobalMapping always does
> >
> >   ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, size,
> >                                       ltog, PETSC_OWN_POINTER, &dm->ltogmap);CHKERRQ(ierr);
> >
> > i.e. the mapping is created with block size == 1.
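A sketch of what the blocked mapping could look like instead, assuming size
is divisible by the discovered bs (blockIdx is hypothetical: it would hold
one global block index per block, e.g. blockIdx[i] = ltog[i*bs]/bs):

  if (bs > 1 && size % bs == 0) {
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, bs, size/bs,
                                        blockIdx, PETSC_OWN_POINTER, &dm->ltogmap);CHKERRQ(ierr);
  } else {
    ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, size,
                                        ltog, PETSC_OWN_POINTER, &dm->ltogmap);CHKERRQ(ierr);
  }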
> >
> > So there are two bugs here:
> >
> > 1. In parallel, block size discovery in Mat creation has gone wrong.
> Crap. Hoist by my own petard. Okay, I will fix this.
>
>
> > 2. (Always) the lgmap has a block size of 1, irrespective of the
> > discovered block size from the section.
>
> Yep. This can also be fixed. It should work regardless, but would be better
> with blocking.
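For context: once the lgmap carries the discovered block size, assembly can
go through the blocked path. A sketch assuming 2x2 blocks and a matrix A
obtained from DMCreateMatrix():

  /* Insert one 2x2 block using local *block* indices. */
  const PetscScalar vals[4] = {1.0, 0.0, 0.0, 1.0}; /* row-major block */
  PetscInt          brow = 0, bcol = 0;
  ierr = MatSetValuesBlockedLocal(A, 1, &brow, 1, &bcol, vals, ADD_VALUES);CHKERRQ(ierr);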
>
>   Thanks
>
>      Matt
>
> --
> Dr Adrian Croucher
> Senior Research Fellow
> Department of Engineering Science
> University of Auckland, New Zealand
> email: a.croucher at auckland.ac.nz
> tel: +64 (0)9 923 84611
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

