[petsc-users] empty split for fieldsplit

Hong hzhang at mcs.anl.gov
Sun Jun 18 17:45:59 CDT 2017


Hoang,
I pushed a fix
https://bitbucket.org/petsc/petsc/commits/d4e3277789d24018f0db1641a80db7be76600165
and added your test to
petsc/src/ksp/ksp/examples/tests/ex53.c

It is on the branch hzhang/fix-blockedIS-submat
Let me know if it still does not fix your problem.

Hong

On Sat, Jun 17, 2017 at 4:06 PM, Zhang, Hong <hzhang at mcs.anl.gov> wrote:

> Never mind, I see that it is OK to set blocksize=2.
> ------------------------------
> *From:* Zhang, Hong
> *Sent:* Saturday, June 17, 2017 3:56:35 PM
>
> *To:* Smith, Barry F.; Hoang Giang Bui
> *Cc:* petsc-users
> *Subject:* Re: [petsc-users] empty split for fieldsplit
>
>
> Matrix A is a tridiagonal matrix with blocksize=1.
>
> Why do you set block_size=2 for A_IS and B_IS?
>
>
> Hong
> ------------------------------
> *From:* Zhang, Hong
> *Sent:* Friday, June 16, 2017 7:55:45 AM
> *To:* Smith, Barry F.; Hoang Giang Bui
> *Cc:* petsc-users
> *Subject:* Re: [petsc-users] empty split for fieldsplit
>
>
> I'm in Boulder and will be back home this evening.
>
> Will test it this weekend.
>
>
> Hong
> ------------------------------
> *From:* Smith, Barry F.
> *Sent:* Thursday, June 15, 2017 1:38:11 PM
> *To:* Hoang Giang Bui; Zhang, Hong
> *Cc:* petsc-users
> *Subject:* Re: [petsc-users] empty split for fieldsplit
>
>
>    Hong,
>
> Please build the attached code with master and run with
>
> petscmpiexec -n 2 ./ex1 -mat_size 40 -block_size 2 -method 2
>
> I think this is a bug in your new MatGetSubMatrix routines. You take the
> block size of the outer IS and pass it into the inner IS, but that inner IS
> may not support the same block size, hence the crash.
>
>    Can you please debug this?
>
>     Thanks
>
>      Barry
>
>
>
> > On Jun 15, 2017, at 7:56 AM, Hoang Giang Bui <hgbk2008 at gmail.com> wrote:
> >
> > Hi Barry
> >
> > Thanks for pointing out the error. I think the problem comes from the
> > zero fieldsplit on proc 0. In this modified example, I parameterized the
> > matrix size and block size, so when you execute
> >
> > mpirun -np 2 ./ex -mat_size 40 -block_size 2 -method 1
> >
> > everything is fine. With method=1, the fieldsplit size of B is nonzero
> > and divisible by the block size.
> >
> > With method=2, i.e. mpirun -np 2 ./ex -mat_size 40 -block_size 2 -method
> > 2, fieldsplit B is empty on proc 0 and the error is thrown:
> >
> > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [1]PETSC ERROR: Arguments are incompatible
> > [1]PETSC ERROR: Local size 11 not compatible with block size 2
> >
> > This does not seem logical, because 0 is divisible by block_size.
> >
> > Furthermore, if you execute "mpirun -np 2 ./ex -mat_size 20 -block_size
> > 2 -method 2", the code hangs in ISSetBlockSize, which is very similar to
> > my original problem. The original case probably also hung in
> > ISSetBlockSize, which I did not realize at the time.
> >
> > Giang
> >
> > On Wed, Jun 14, 2017 at 5:29 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > You can't do this
> >
> >    ierr = MatSetSizes(A,PETSC_DECIDE,N,N,N);CHKERRQ(ierr);
> >
> >   use PETSC_DECIDE for the third argument
> >
> > Also this is wrong
> >
> >   for (i = Istart; i < Iend; ++i)
> >    {
> >        ierr = MatSetValue(A,i,i,2,INSERT_VALUES);CHKERRQ(ierr);
> >        ierr = MatSetValue(A,i+1,i,-1,INSERT_VALUES);CHKERRQ(ierr);
> >        ierr = MatSetValue(A,i,i+1,-1,INSERT_VALUES);CHKERRQ(ierr);
> >    }
> >
> > you will get
> >
> > $ petscmpiexec -n 2 ./ex1
> > 0: Istart = 0, Iend = 60
> > 1: Istart = 60, Iend = 120
> > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [1]PETSC ERROR: Argument out of range
> > [1]PETSC ERROR: Row too large: row 120 max 119
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [1]PETSC ERROR: Petsc Development GIT revision: v3.7.6-4103-g93161b8192
> GIT Date: 2017-06-11 14:49:39 -0500
> > [1]PETSC ERROR: ./ex1 on a arch-basic named Barrys-MacBook-Pro.local by
> barrysmith Wed Jun 14 18:26:52 2017
> > [1]PETSC ERROR: Configure options PETSC_ARCH=arch-basic
> > [1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 550 in
> /Users/barrysmith/Src/petsc/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: #2 MatSetValues() line 1270 in
> /Users/barrysmith/Src/petsc/src/mat/interface/matrix.c
> > [1]PETSC ERROR: #3 main() line 30 in /Users/barrysmith/Src/petsc/
> test-dir/ex1.c
> > [1]PETSC ERROR: PETSc Option Table entries:
> > [1]PETSC ERROR: -malloc_test
> >
> > You need to get the example working so it ends with the error you
> > reported previously, not these other bugs.
> >
> >
> > > On Jun 12, 2017, at 10:19 AM, Hoang Giang Bui <hgbk2008 at gmail.com>
> wrote:
> > >
> > > Dear Barry
> > >
> > > I made a small example with 2 processes and one empty split on proc 0,
> > > but it gives another strange error:
> > >
> > > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > [1]PETSC ERROR: Arguments are incompatible
> > > [1]PETSC ERROR: Local size 31 not compatible with block size 2
> > >
> > > The local size is always 60, so this is confusing.
> > >
> > > Giang
> > >
> > > On Sun, Jun 11, 2017 at 8:11 PM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> > >   Could be, send us a simple example that demonstrates the problem and
> we'll track it down.
> > >
> > >
> > > > On Jun 11, 2017, at 12:34 PM, Hoang Giang Bui <hgbk2008 at gmail.com>
> wrote:
> > > >
> > > > Hello
> > > >
> > > > I noticed that my code stalls for a very long time, possibly hanging,
> > > > in PCFieldSplitSetIS. There are two splits, and one split is empty on
> > > > one process. Could that be the reason PCFieldSplitSetIS hangs?
> > > >
> > > > Giang
> > >
> > >
> > > <ex.c>
> >
> >
> > <ex.c>
>
>

