[petsc-dev] [Fortran] MatLoad() error: MatLoad is not supported for type: mpiaij!

Matthew Knepley knepley at gmail.com
Fri Sep 3 11:35:55 CDT 2010


On Fri, Sep 3, 2010 at 5:50 PM, Leo van Kampenhout <lvankampenhout at gmail.com> wrote:

> Thanks. Following this conversation closely, I have already built petsc
> with --with-debugging=0 and found that things work better there. Until
> there is an ultimate fix, I'm perfectly happy with that.
>
> Even on my cluster, I can now run the program on 2, 4, and 8 cores. But
> there is always a catch... with 10 cores, GMRES breaks down! (see below)
> Sorry to bother you with this; it seems unrelated at first sight. Should I
> start a new thread for this?
>

This is almost certainly a place where you are not careful about constructing
a Vec with the proper parallel layout (as the message says). In fact, from the
trace it looks like the right-hand-side Vec passed in is not compatible with
the matrix: on rank 5 the matrix has 210161 local rows but the Vec has 213282
local entries.
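
The simplest way to guarantee compatible layouts is to let the matrix create
the vectors for you. A minimal sketch against the petsc-dev API of this era
(MatGetVecs() was later renamed MatCreateVecs(), and the destroy calls later
changed to take a pointer); the function name and the constant right-hand
side are placeholders:

  #include "petscksp.h"

  /* Sketch: obtain vectors whose parallel layout conforms to A,
     instead of sizing them by hand. */
  PetscErrorCode SolveWithMatchingLayout(KSP ksp, Mat A)
  {
    Vec            x, b;
    PetscErrorCode ierr;

    /* x conforms to the columns of A (MatMult input), b to the rows
       (MatMult output), so the "Nonconforming object sizes" check in
       MatMult() cannot trigger. */
    ierr = MatGetVecs(A, &x, &b);CHKERRQ(ierr);
    ierr = VecSet(b, 1.0);CHKERRQ(ierr);        /* placeholder RHS */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = VecDestroy(x);CHKERRQ(ierr);
    ierr = VecDestroy(b);CHKERRQ(ierr);
    return 0;
  }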

  Matt


> Regards, Leo
>
> [5]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [5]PETSC ERROR: Nonconforming object sizes!
> [5]PETSC ERROR: Mat mat,Vec y: local dim 210161 213282!
> [5]PETSC ERROR:
> ------------------------------------------------------------------------
> [5]PETSC ERROR: Petsc Development HG revision:
> 6da7d8c5aaa81099861622f4e6bf1087652253a7  HG Date: Fri Sep 03 15:06:40 2010
> +0200
> [5]PETSC ERROR: See docs/changes/index.html for recent updates.
> [5]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [5]PETSC ERROR: See docs/index.html for manual pages.
> [5]PETSC ERROR:
> ------------------------------------------------------------------------
> [5]PETSC ERROR: ./main on a linux-opt named node164 by s1459295    Fri Sep
> 3 17:44:26 2010
> [5]PETSC ERROR: Libraries linked from
> /home/s1459295/install/petsc-dev/linux-optimized/lib
> [5]PETSC ERROR: Configure run at Fri Sep  3 15:28:20 2010
> [5]PETSC ERROR: Configure options --with-debugging=0
> [5]PETSC ERROR:
> ------------------------------------------------------------------------
> [5]PETSC ERROR: MatMult() line 1976 in src/mat/interface/matrix.c
> [5]PETSC ERROR: KSPInitialResidual() line 55 in
> src/ksp/ksp/interface/itres.c
> [5]PETSC ERROR: KSPSolve_GMRES() line 236 in
> src/ksp/ksp/impls/gmres/gmres.c
> [5]PETSC ERROR: KSPSolve() line 427 in src/ksp/ksp/interface/itfunc.c
>
>
>
>
>
> 2010/9/3 Jed Brown <jed at 59a2.org>
>
>> On Fri, 3 Sep 2010 09:15:09 -0600 (GMT-06:00), Shri <abhyshr at mcs.anl.gov> wrote:
>> > Leo,
>> >     Jed and I both tried running a MatLoad example (see
>> > src/dm/da/examples/tests/ex35.c) with petsc configured in debug mode and
>> > had issues running it with your grid size 202 X 102 X 102. This example
>> > runs in a reasonable amount of time with the optimized petsc build.
>> >     I suggest running your code on a smaller grid (100 X 100 X 100 seems
>> > to work fine in debug mode) for now, getting it working, and then using
>> > the optimized petsc build (--with-debugging=0) for the large grid size.
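>> >
>> > For reference, a minimal sketch of loading a binary matrix with the
>> > petsc-dev interface of this period (the file name is a placeholder;
>> > the calling sequence changed around this time, and released petsc
>> > still used MatLoad(viewer, MATMPIAIJ, &A)):
>> >
>> >   Mat            A;
>> >   PetscViewer    viewer;
>> >   PetscErrorCode ierr;
>> >
>> >   /* open the binary file written earlier by MatView() */
>> >   ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat",
>> >                                FILE_MODE_READ, &viewer);CHKERRQ(ierr);
>> >   /* create the matrix, select the parallel AIJ format, and load it
>> >      with PETSc's default row distribution */
>> >   ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>> >   ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
>> >   ierr = MatLoad(A, viewer);CHKERRQ(ierr);
>> >   ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);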
>>
>> I wrote an improved sort that doesn't have this bad behavior, but I'm
>> trying to figure out why the old one is okay in optimized mode.  See
>> here:
>>
>>  http://gist.github.com/563948
>>
>> The MEDIAN macro chooses the median of the first, last, and middle
>> elements in the segment.  Performance looks like
>>
>>  FAST -O1: 2.0 seconds
>>  FAST -O0: 5 seconds
>>  SLOW -O1: 2.3 seconds
>>  SLOW -O0: more than an hour
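>>
>> For context, a minimal sketch of a median-of-three selection (this is
>> not the gist's code; the names are illustrative):
>>
>>   /* Return the index of the median of a[lo], a[mid], a[hi], so the
>>      pivot stays reasonable even for sorted or reverse-sorted input. */
>>   static int median3(const int a[], int lo, int mid, int hi)
>>   {
>>     if (a[lo] < a[mid]) {
>>       if (a[mid] < a[hi]) return mid;       /* a[lo] < a[mid] < a[hi] */
>>       return (a[lo] < a[hi]) ? hi : lo;     /* max of a[lo], a[hi] */
>>     } else {
>>       if (a[lo] < a[hi]) return lo;         /* a[mid] <= a[lo] < a[hi] */
>>       return (a[mid] < a[hi]) ? hi : mid;   /* max of a[mid], a[hi] */
>>     }
>>   }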
>>
>> SLOW, which is basically the old implementation, does a lot of shuffling
>> of the array and ends up hitting a bad case for quicksort (even with the
>> median choice).  What I don't understand is that building SLOW with -O1
>> apparently transforms the algorithm into something asymptotically
>> faster.  The assembly is a few hundred lines, and it's not obvious to me
>> what this transform would be.
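>>
>> (For scale: a bad pivot sequence degrades quicksort from the balanced
>> recurrence T(n) = 2 T(n/2) + O(n) = O(n log n) to
>> T(n) = T(n-1) + O(n) = O(n^2); at the roughly two million entries of a
>> 202 X 102 X 102 grid that is easily the difference between seconds and
>> hours.)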
>>
>> Jed
>>
>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener