[petsc-dev] bug in MatZeroRowsColumns for MPIAIJ

Matthew Knepley knepley at gmail.com
Fri Jan 17 10:57:54 CST 2014


On Fri, Jan 17, 2014 at 5:43 AM, Stephan Kramer <s.kramer at imperial.ac.uk> wrote:

>
>> Okay, I have checked in preliminary code. It passes the tests for Mat
>> ex18. I have merged it to next. Let me know how
>> it works for you.
>>
>>    Thanks,
>>
>>       Matt
>>
>>
> I've tried it out with various configurations of the test and it seems to
> work fine for me. I did first hit a segfault due to the uninitialised len
> variable, but I see that's fixed now. I've also tried it out in our own
> code (Fluidity) for what we actually want to use it for (getting rid of the
> ugly hack of setting BCs with big numbers on the diagonal) and it passes
> all tests. So thanks a lot for your effort: this is very useful for us.
>
> I think it might be worth changing the block size bs in the example to
> something bigger than 1, so that we test non-trivial block sizes. Also, the
> nonlocalBC variable is uninitialised. I actually found a bug in my test if
> you set nonlocalBC = PETSC_TRUE with rank>3. I've pasted the diff below
>
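[For context, a minimal sketch of how MatZeroRowsColumns() replaces the
big-number-on-the-diagonal approach mentioned above. This is not code from
Fluidity or ex18.c; the names nbc, bc_rows and bc_vals are hypothetical
placeholders for the locally owned boundary rows and their prescribed values.

  /* Put the prescribed Dirichlet values into the solution vector x so that
     the right-hand side b can be corrected for the eliminated columns.
     bc_rows holds the global (scalar) row indices owned by this process. */
  ierr = VecSetValues(x,nbc,bc_rows,bc_vals,INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(x);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(x);CHKERRQ(ierr);

  /* Zero the boundary rows and columns of the assembled MPIAIJ matrix A,
     put 1.0 on the diagonal, and adjust b so that the interior equations
     still see the boundary values. */
  ierr = MatZeroRowsColumns(A,nbc,bc_rows,1.0,x,b);CHKERRQ(ierr);
]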

I have made these fixes, added more tests, and put it in the nightly builds.

  Thanks,

     Matt


> Cheers
> Stephan
>
> diff --git a/src/mat/examples/tests/ex18.c b/src/mat/examples/tests/ex18.c
> index 2d9ef25..40f3b9e 100644
> --- a/src/mat/examples/tests/ex18.c
> +++ b/src/mat/examples/tests/ex18.c
> @@ -9,12 +9,12 @@ int main(int argc,char **args)
>  {
>    Mat            A;
>    Vec            x, rhs, y;
> -  PetscInt       i,j,k,b,m = 3,n,nlocal=2,bs=1,Ii,J;
> +  PetscInt       i,j,k,b,m = 3,n,nlocal=2,bs=2,Ii,J;
>    PetscInt       *boundary_nodes, nboundary_nodes, *boundary_indices;
>    PetscMPIInt    rank,size;
>    PetscErrorCode ierr;
>    PetscScalar    v,v0,v1,v2,a0=0.1,a,rhsval, *boundary_values;
> -  PetscBool      upwind = PETSC_FALSE, nonlocalBC;
> +  PetscBool      upwind = PETSC_FALSE, nonlocalBC = PETSC_TRUE;
>
>    ierr = PetscInitialize(&argc,&args,(char*)0,help);CHKERRQ(ierr);
>    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
> @@ -80,11 +80,15 @@ int main(int argc,char **args)
>        ierr = PetscMalloc1(nboundary_nodes,&boundary_nodes);CHKERRQ(ierr);
>        k = 0;
>        for (i=size; i<m; i++,k++) {boundary_nodes[k] = n*i;};
> -    } else {
> -      nboundary_nodes = nlocal+1;
> +    } else if (rank<m) {
> +      nboundary_nodes = nlocal + 1;
>        ierr = PetscMalloc1(nboundary_nodes,&boundary_nodes);CHKERRQ(ierr);
>        boundary_nodes[0] = rank*n;
>        k = 1;
> +    } else {
> +      nboundary_nodes = nlocal;
> +      ierr = PetscMalloc1(nboundary_nodes,&boundary_nodes);CHKERRQ(ierr);
> +      k = 0;
>      };
>    for (j=nlocal*rank; j<nlocal*(rank+1); j++,k++) {boundary_nodes[k] = j;};
>    } else {
>
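[As an aside, a hedged sketch of how the node numbers built in the diff above
might be expanded into scalar row indices for MatZeroRowsColumns() when
bs > 1. Variable names follow the test, but this is an illustration under
assumptions, not a verbatim excerpt of ex18.c.

  /* With block size bs, each boundary node contributes bs consecutive
     matrix rows; boundary_values here are placeholder prescribed data. */
  ierr = PetscMalloc1(nboundary_nodes*bs,&boundary_indices);CHKERRQ(ierr);
  ierr = PetscMalloc1(nboundary_nodes*bs,&boundary_values);CHKERRQ(ierr);
  for (i = 0; i < nboundary_nodes; i++) {
    for (b = 0; b < bs; b++) {
      boundary_indices[i*bs + b] = boundary_nodes[i]*bs + b;
      boundary_values[i*bs + b]  = 1.0;
    }
  }
  ierr = MatZeroRowsColumns(A,nboundary_nodes*bs,boundary_indices,1.0,x,rhs);CHKERRQ(ierr);
]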



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener