[petsc-dev] [Fortran] MatLoad() error: MatLoad is not supported for type: mpiaij!

Shri abhyshr at mcs.anl.gov
Tue Sep 7 08:15:20 CDT 2010


I've pushed a fix. Please do hg pull.

----- Leo van Kampenhout wrote:
> OK, I have done that; please have a look. It's a really weird problem.

2010/9/6 Barry Smith <bsmith at mcs.anl.gov>

>    This should not happen. Send the code to petsc-maint at mcs.anl.gov
>
>    Barry

On Sep 6, 2010, at 10:54 AM, Leo van Kampenhout wrote:
Hello Barry, Matt,

Regarding the error I am getting:

  [2]PETSC ERROR: Nonconforming object sizes!
  [2]PETSC ERROR: Mat mat,Vec y: local dim 700536 697068!

I am creating my RHS and initial vector from the same DA that the matrix comes from (see below). I even tried setting RHS == 1 instead of loading it from file, to no avail. I found that the error occurs when #PROC = 3, 5, 7, ... and above 8, but NOT when #PROC = 2, 4, or 8. Might it have something to do with differences in the partitioning strategies of Mat and Vec? As you know, my grid size is rather unusual (the number of grid points is 2101608).

Leo

      ! create solution and RHS vectors from the same DA as the matrix
      call DACreateGlobalVector(da,x,ierr)
      call VecDuplicate(x,b,ierr)

(...)

      ! create the matrix from the DA
      call DAGetMatrix(da,MATMPIAIJ,A,ierr)

(...)

      ! set the RHS to all ones
      call VecSet(b,one,ierr)

      call KSPCreate(MPI_COMM_WORLD,ksp,ierr)
      call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)

(...)

      call KSPSolve(ksp,b,x,ierr)
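
For reference, the two local dimensions in the error above are consistent with this partitioning hypothesis. Assuming the 202 x 102 x 102 grid (2101608 points, one degree of freedom per point): an even PETSC_DECIDE split over 3 ranks gives 2101608/3 = 700536 local entries, while a DA that splits the 202-direction as 68+67+67 gives 67 x 102 x 102 = 697068 local rows on two of the ranks. On 2, 4, or 8 ranks the DA decomposition happens to be exactly even (e.g. 101 x 51 x 102 = 525402 = 2101608/4), which would explain why those counts work. A minimal diagnostic sketch, reusing A, b, and ierr from the code above:

      PetscInt mloc,nloc,vloc
      ! the matrix's local row count and the vector's local size
      ! must agree on every rank for MatMult() to succeed
      call MatGetLocalSize(A,mloc,nloc,ierr)
      call VecGetLocalSize(b,vloc,ierr)
      write(*,*) 'Mat local rows:',mloc,' Vec local size:',vloc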





> 2010/9/3 Barry Smith <bsmith at mcs.anl.gov>
>
>    Your initial vectors must be created with the same DA that is associated with the matrix for the layout to be automatically correct. Use DACreateGlobalVector() instead of VecCreate(). You may already be doing this, in which case the problem is more subtle. Let us know.
>
>    Barry
>
>    Something may be going wrong with the header copy.
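
A minimal sketch of the two construction patterns Barry contrasts (da, x, and N are illustrative names; only the first guarantees a layout conforming to a matrix obtained from DAGetMatrix()):

      ! layout-safe: the vector inherits the DA's parallel layout
      call DACreateGlobalVector(da,x,ierr)

      ! pattern to avoid: PETSC_DECIDE splits the N entries as evenly
      ! as possible, which generally differs from the DA's
      ! dimension-wise decomposition
      ! call VecCreate(MPI_COMM_WORLD,x,ierr)
      ! call VecSetSizes(x,PETSC_DECIDE,N,ierr)
      ! call VecSetFromOptions(x,ierr)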
On Sep 3, 2010, at 11:35 AM, Matthew Knepley wrote:

On Fri, Sep 3, 2010 at 5:50 PM, Leo van Kampenhout <lvankampenhout at gmail.com> wrote:

> Thanks. Following this conversation closely, I have already built PETSc with --with-debugging=0 and found that things work better there. Until there is an ultimate fix, I'm perfectly happy with that.
>
> Even on my cluster, I can now run the program on 2, 4, and 8 cores. But there is always a catch... with 10 cores, GMRES breaks down! (see below) Sorry to bother you with this; it seems unrelated at first sight. Should I start a new thread for this?

This is almost certainly a place where you are not careful about constructing a Vec with the proper layout (as the message says). In fact, from the trace it looks like the rhs Vec passed in is not compatible with the matrix.

   Matt

> Regards, Leo
>

> [5]PETSC ERROR: --------------------- Error Message ------------------------------------
> [5]PETSC ERROR: Nonconforming object sizes!
> [5]PETSC ERROR: Mat mat,Vec y: local dim 210161 213282!
> [5]PETSC ERROR: ------------------------------------------------------------------------
> [5]PETSC ERROR: Petsc Development HG revision: 6da7d8c5aaa81099861622f4e6bf1087652253a7  HG Date: Fri Sep 03 15:06:40 2010 +0200
> [5]PETSC ERROR: See docs/changes/index.html for recent updates.
> [5]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [5]PETSC ERROR: See docs/index.html for manual pages.
> [5]PETSC ERROR: ------------------------------------------------------------------------
> [5]PETSC ERROR: ./main on a linux-opt named node164 by s1459295 Fri Sep  3 17:44:26 2010
> [5]PETSC ERROR: Libraries linked from /home/s1459295/install/petsc-dev/linux-optimized/lib
> [5]PETSC ERROR: Configure run at Fri Sep  3 15:28:20 2010
> [5]PETSC ERROR: Configure options --with-debugging=0
> [5]PETSC ERROR: ------------------------------------------------------------------------
> [5]PETSC ERROR: MatMult() line 1976 in src/mat/interface/matrix.c
> [5]PETSC ERROR: KSPInitialResidual() line 55 in src/ksp/ksp/interface/itres.c
> [5]PETSC ERROR: KSPSolve_GMRES() line 236 in src/ksp/ksp/impls/gmres/gmres.c
> [5]PETSC ERROR: KSPSolve() line 427 in src/ksp/ksp/interface/itfunc.c
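
The same arithmetic as before fits this 10-core trace: an even split of 2101608 over 10 ranks gives 210161 on rank 5, while a 5 x 2 x 1 DA decomposition (202 as 41+41+40+40+40, 102 as 51+51) would give 41 x 51 x 102 = 213282 local rows. A hedged sketch of two ways to rule this mismatch out before KSPSolve(), reusing A, x, b, and ierr from the code earlier in the thread:

      PetscInt mlow,mhigh,vlow,vhigh

      ! vectors obtained from the matrix itself always conform to its
      ! layout, so MatMult() inside KSPSolve() cannot see a mismatch
      call MatGetVecs(A,x,b,ierr)

      ! alternatively, compare ownership ranges rank by rank
      call MatGetOwnershipRange(A,mlow,mhigh,ierr)
      call VecGetOwnershipRange(b,vlow,vhigh,ierr)
      if (mlow .ne. vlow .or. mhigh .ne. vhigh) then
         write(*,*) 'layout mismatch: Mat',mlow,mhigh,' Vec',vlow,vhigh
      end if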





2010/9/3 Jed Brown <jed at 59a2.org>

> On Fri, 3 Sep 2010 09:15:09 -0600 (GMT-06:00), Shri <abhyshr at mcs.anl.gov> wrote:

> > Leo,
> >     Jed and I both tried running a MatLoad example (see src/dm/da/examples/tests/ex35.c) with PETSc configured in debug mode and had trouble running it with your grid size 202 x 102 x 102. The example runs in a reasonable amount of time with the optimized PETSc build.
> >
> >     I suggest running your code on a smaller grid for now (100 x 100 x 100 seems to work fine in debug mode), getting it working, and then using the optimized PETSc build (--with-debugging=0) for the large grid size.

> I wrote an improved sort that doesn't have this bad behavior, but I'm
> trying to figure out why the old one is okay in optimized mode. See
> here:
>
>   http://gist.github.com/563948
>
> The MEDIAN macro chooses the median of the first, last, and middle
> elements in the segment (a sketch of the idea follows below).
> Performance looks like
>
>   FAST -O1: 2.0 seconds
>   FAST -O0: 5 seconds
>   SLOW -O1: 2.3 seconds
>   SLOW -O0: more than an hour
>
> SLOW, which is basically the old implementation, does a lot of shuffling
> of the array and ends up hitting a bad case for quicksort (even with
> the median pivot choice). What I don't understand is why building SLOW
> with -O1 apparently transforms the algorithm into something
> asymptotically faster. The assembly is a few hundred lines, and it's
> not obvious to me what this transform would be.
>
> Jed
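
The actual MEDIAN macro lives in the C code at the gist above; as an illustration of the idea only, here is a minimal Fortran sketch of median-of-three pivot selection (the function and its names are illustrative, not PETSc API):

      ! return the index (i, j, or k) of the median of a(i), a(j), a(k)
      integer function median3(a,i,j,k)
      implicit none
      integer, intent(in) :: a(*), i, j, k
      if ((a(j) .le. a(i) .and. a(i) .le. a(k)) .or.
     &    (a(k) .le. a(i) .and. a(i) .le. a(j))) then
         median3 = i      ! a(i) lies between a(j) and a(k)
      else if ((a(i) .le. a(j) .and. a(j) .le. a(k)) .or.
     &         (a(k) .le. a(j) .and. a(j) .le. a(i))) then
         median3 = j      ! a(j) lies between a(i) and a(k)
      else
         median3 = k
      end if
      end function median3

Median-of-three avoids quicksort's quadratic worst case on already-sorted or reversed input, although, as Jed notes, SLOW manages to hit a bad case even with this pivot choice.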



-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener



