[petsc-dev] [Fortran] MatLoad() error: MatLoad is not supported for type: mpiaij!

Barry Smith bsmith at mcs.anl.gov
Thu Sep 2 12:56:39 CDT 2010


  Most of the time is in MatLoad_MPIAIJ() (you are running in parallel, right?), so you could compile and run with gprof to see where in the load most of the time is spent. It is probably in disk reads and MPI sends.
But the repartitioning for the DA is not taking much of the time.

   Barry

On Sep 2, 2010, at 12:46 PM, Shri wrote:

> 
> > That code cannot be right. When you copy the App to A it does not have the same nonzero pattern. You should not use MatCopy() there; you should use MatHeaderMerge() or MatHeaderReplace(), whichever one is right.
> 
> Thanks, I was searching for a similar function but did not find it, hence used MatCopy(). I've fixed this and pushed it.
> 
>  > Then try running with a big DA to see how long it takes.
>    
> I tried MatLoad on Leo's 3D grid (202 X 102 X 102) but it takes forever to load. Hence, I tried it on a relatively smaller grid (100 X 100 X 100); here are the log summary numbers that I got.
> 
> Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
>                    Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
> 
> --- Event Stage 0: Main Stage
> 
> MatAssemblyBegin       4 1.0 2.6508e-0134.6 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00  3  0  0  0  5   3  0  0  0  6     0
> MatAssemblyEnd         4 1.0 4.2604e-01 1.0 0.00e+00 0.0 1.2e+01 2.0e+04 3.6e+01  9  0 63  1 28   9  0 63  1 38     0
> MatGetSubMatrice       1 1.0 5.9222e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 13  0  0  0  4  13  0  0  0  5     0
> MatLoad                1 1.0 3.2231e+00 1.0 0.00e+00 0.0 1.1e+01 4.0e+06 4.9e+01 70  0 58100 38  70  0 58100 52     0
> ------------------------------------------------------------------------------------------------------------------------
> 
> 
> Shri
>  
> 
> On Sep 2, 2010, at 10:17 AM, Leo van Kampenhout wrote:
> 
> Hello Shri,
> 
> thanks for the fix. The error is now gone, but loading the Mat now takes forever... help!!
> 
> Leo
> 
> 
> 2010/9/2 Shri <abhyshr at mcs.anl.gov>
> 
> >
> Pushed. Please let me know if you get any errors.
> Shri
> 
> >
> 
> 
> ----- Barry Smith wrote:
>
>   if (size > 1) { 
>     /* change viewer to display matrix in natural ordering */
>     ierr = MatShellSetOperation(A, MATOP_VIEW, (void (*)(void)) MatView_MPI_DA);CHKERRQ(ierr);
>
>     /* turn off loading of matrix because loading would require proper permutation I don't feel like writing now */
>     ierr = MatShellSetOperation(A, MATOP_LOAD, (void (*)(void)) 0);CHKERRQ(ierr);
>   }
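[Editor's note: Barry's fragment above edits the matrix's operation table, and installing a null pointer in the MATOP_LOAD slot is exactly why MatLoad() later fails with "not supported". A minimal sketch of that override pattern, assuming the petsc-dev API of this era; the helper name DisableMatLoad is hypothetical, not part of PETSc:]

```c
/* Sketch only (petsc-dev circa 2010): null out an operation so later
   calls through it fail with "No support for this operation". */
#include <petscmat.h>

PetscErrorCode DisableMatLoad(Mat A)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* MATOP_LOAD indexes the Mat's function table; a zero entry makes a
     subsequent MatLoad(A, viewer) error out instead of dispatching. */
  ierr = MatShellSetOperation(A, MATOP_LOAD, (void (*)(void)) 0);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```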
> 
> 
>   Shri,
> 
>     Could you please add support for this and push it to petsc-dev?
> 
>     Thanks
> 
>      Barry
> 
> 
> On Sep 1, 2010, at 10:23 AM, Leo van Kampenhout wrote:
> 
> >
> 
> Using a DA of equal size and type as the other one. The curious thing is that there are no problems on a single-core run (but that is not what I want ;)
> 
> 
> 
> 2010/9/1 Barry Smith <bsmith at mcs.anl.gov>
> 
> >
> >
> 
>    Did you get the matrix in the writer code using a DA, or some other code that puts the vector from the 2d or 3d problem in the natural ordering on the binary file?
> 
> >
> >
>    Barry
> 
> On Sep 1, 2010, at 9:55 AM, Leo van Kampenhout wrote:
> 
> Hi all, 
> 
> I'm having trouble with MatLoad on petsc-dev. I am not sure whether this problem also exists in the PETSc 3.1 release, since my code won't compile on that. 
> 
>
> The problematic calls are as follows: 
>
>       call PetscViewerBinaryOpen(PETSC_COMM_WORLD,file2,
>      &     FILE_MODE_READ,fd,ierr)
>       call DAGetMatrix(da,MATMPIAIJ,A,ierr)
>       call MatLoad(A,fd,ierr)
> 
>
> which should load a matrix A that was stored by the following commands:
>
>       call PetscViewerBinaryOpen(PETSC_COMM_WORLD,file2,
>      &     FILE_MODE_WRITE,fd,ierr)
>       call MatView(A,fd,ierr)
> 
> by another program (on a single core). MatLoad() works perfectly on a single core; however, with two or more cores the program crashes with the error shown below.
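[Editor's note: for reference against the C interface, the Fortran write/read pair above corresponds roughly to the sketch below. The file name and helper names are placeholders, error checking is omitted, and the petsc-dev-era signatures MatLoad(Mat,PetscViewer) and by-value PetscViewerDestroy(PetscViewer) are assumed:]

```c
/* Sketch only (petsc-dev circa 2010); error checking omitted. */
#include <petscda.h>
#include <petscmat.h>

/* Writer side: dump an assembled Mat to a binary file. */
static void WriteMatrix(Mat A)
{
  PetscViewer fd;
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.bin", FILE_MODE_WRITE, &fd);
  MatView(A, fd);
  PetscViewerDestroy(fd);
}

/* Reader side: let the DA preallocate the parallel layout, then load
   into the already-created matrix. */
static void ReadMatrix(DA da, Mat *A)
{
  PetscViewer fd;
  DAGetMatrix(da, MATMPIAIJ, A);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.bin", FILE_MODE_READ, &fd);
  MatLoad(*A, fd);
  PetscViewerDestroy(fd);
}
```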
> 
>
> The program call is 
>
>   mpirun -n 2 ./main
>
> Is something broken in PETSc, or do I need to set some extra parameters? 
> 
> >
> > 
> > Thanks in advance, 
> 
> Leo van Kampenhout
> 
> PS. I'm using a build from last Friday, since when I updated petsc-dev today using Mercurial I couldn't get it to configure or compile :S (which is another subject)
> 
>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type!
> [0]PETSC ERROR: MatLoad is not supported for type: mpiaij!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 84fec3ca75412473b06251029b58f2a68f1334f3  HG Date: Fri Aug 27 07:06:08 2010 +0200
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./main on a arch-linu named wingtip72 by csg4035 Wed Sep  1 15:52:11 2010
> [0]PETSC ERROR: Libraries linked from /net/users/csg/csg4035/install/petsc-dev/arch-linux-gnu-c-debug/lib
> [0]PETSC ERROR: Configure run at Fri Aug 27 08:37:32 2010
> [0]PETSC ERROR: Configure options 
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatLoad() line 843 in src/mat/interface/matrix.c
