Preallocation for external direct solvers

Paul T. Bauman pbauman at ices.utexas.edu
Sat Apr 14 15:52:34 CDT 2007


Hello,

I'm developing some code and am using direct solvers (for the moment), 
in particular MUMPS and SuperLU. I noticed that PETSc is not 
preallocating memory for the matrix even though I specify the maximum 
number of nonzero entries per row:

       call MatCreateSeqAIJ(PETSC_COMM_WORLD, total_num_dofs, total_num_dofs, &
            max_number_nonzeros_row, PETSC_NULL_INTEGER, J, ierr)

       call MatSetFromOptions(J,ierr)

       call SNESSetJacobian(snes,J,J,FormJacobian,PETSC_NULL_OBJECT,ierr)
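
For reference, a minimal sketch of the same creation call using an exact 
per-row nonzero count instead of a single maximum; nnz_per_row here is a 
placeholder for an integer array of length total_num_dofs that my code 
would have to fill from the connectivity (it is not defined above):

       ! Sketch only: nnz_per_row is a hypothetical application-built array
       ! giving the exact number of nonzeros in each row; the scalar nz
       ! argument is ignored when the nnz array is supplied.
       call MatCreateSeqAIJ(PETSC_COMM_WORLD, total_num_dofs, total_num_dofs, &
            0, nnz_per_row, J, ierr)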

When I run with -info, I see:

[0] User provided function(): (Fortran):PETSc successfully started: procs 1
[0] User provided function(): Running on machine: dhcp-67-30.ices.utexas.edu
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
[0] MatConvert_AIJ_AIJMUMPS(): Using MUMPS for LU factorization and solves.
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
  0 SNES Function norm 8.500000000000e-01
[0] MatSetUpPreallocation(): Warning not preallocating matrix storage

So, of course, this really slows down the code even for small problems 
(20,000 DOFs). The same thing happens with SuperLU. Should I use a 
different call to do the preallocation?
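
For instance, here is a minimal sketch of one alternative ordering I could 
try, setting the AIJ preallocation only after MatSetFromOptions so that it 
applies to whatever type is selected from the options database; this is 
just my guess at the intended usage and I have not verified that it works 
with the aijmumps conversion:

       call MatCreate(PETSC_COMM_WORLD, J, ierr)
       call MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, total_num_dofs, &
            total_num_dofs, ierr)
       call MatSetFromOptions(J, ierr)
       ! Preallocation applied only after the final matrix type is known
       call MatSeqAIJSetPreallocation(J, max_number_nonzeros_row, &
            PETSC_NULL_INTEGER, ierr)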

Thanks for any suggestions.

Best,

Paul



