Re: Error codes from external packages

Lars Rindorf Lars.Rindorf at teknologisk.dk
Wed Jun 4 07:39:17 CDT 2008


Hi David 

Thanks for the answer and the reference; I'll look into it.

Since my last email I have tried MUMPS. With real scalars and default solver settings it is considerably faster than UMFPACK (43 vs. 53 seconds), which I had not expected. I'll make some more detailed comparisons of the performance of MUMPS and UMFPACK on the Xeon quad core after the summer vacation.
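For reference, the two runs differ only in the solver selection options. On PETSc 2.3.3 the external factorization is chosen through the matrix type, so the option lines are roughly as follows (the MUMPS matrix type name here is from memory and may differ between PETSc versions):

  -ksp_type preonly -pc_type lu -mat_type umfpack    (UMFPACK factorization)
  -ksp_type preonly -pc_type lu -mat_type aijmumps   (MUMPS factorization)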

KR, Lars  

-----Original Message-----
From: owner-petsc-users at mcs.anl.gov [mailto:owner-petsc-users at mcs.anl.gov] On behalf of David Colignon
Sent: 3 June 2008 17:51
To: petsc-users at mcs.anl.gov
Subject: Re: Error codes from external packages

Hi,

My colleague Ch. Geuzaine added complex arithmetic support and 64-bit addressing for UMFPACK last year.

See http://www-unix.mcs.anl.gov/web-mail-archive/lists/petsc-dev/2007/06/msg00000.html

It has been included in petsc-dev.
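If you want to try it with your problem, rebuilding petsc-dev with complex scalars enabled should be all that is needed. A minimal configure line, adapted from the options in your error log below with the complex switch added (check config/configure.py --help for the exact option names in your snapshot), would look something like:

  ./config/configure.py --with-cc=gcc --with-fc=g77 --with-scalar-type=complex --download-mpich=ifneeded --download-umfpack=ifneeded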

Cheers,

Dave

--
David Colignon, Ph.D.
Collaborateur Logistique F.R.S.-FNRS (Equipements de Calcul Intensif)
ACE - Applied & Computational Electromagnetics
Institut Montefiore B28, Université de Liège
4000 Liège - BELGIQUE
Tél: +32 (0)4 366 37 32
Fax: +32 (0)4 366 29 10
WWW:    http://www.montefiore.ulg.ac.be/personnel.php?op=detail&id=898
Agenda: http://www.google.com/calendar/embed?src=david.colignon%40gmail.com


Matthew Knepley wrote:
> On Tue, Jun 3, 2008 at 6:53 AM, Lars Rindorf 
> <Lars.Rindorf at teknologisk.dk> wrote:
>> Hi Matthew
>>
>> I have a couple of questions regarding UMFPACK and PETSc. The program I use relies on complex scalars and UMFPACK through PETSc, but when I try to compile PETSc with UMFPACK and complex numbers myself, PETSc reports that UMFPACK with complex scalars is not yet implemented. Is that correct? Is there a version of PETSc that allows complex scalars with UMFPACK?
> 
> This question came up before on this list. We do not support complex 
> with UMFPACK. I cannot remember the reason, but there was a problem 
> with the complex extension.
> 
>> Secondly, I'm still having problems with UMFPACK running out of memory (UMFPACK error -1). I have played around with PETSc memory allocation via MatSeqAIJSetPreallocation and with UMFPACK's Control[UMFPACK_ALLOC_INIT], and it makes no difference. I have also tried it with both Intel MKL BLAS and PETSc's default BLAS, and that made no difference either. Do you have any ideas where to look for the error?
> 
> Are you trying to go beyond 32-bits? If so, you would need an OS that 
> will allocate more than 2G to a process, like 64-bit Linux.
> 
>   Matt
> 
>> The largest successful simulation gives the following PETSc output:
>> KSP Object:
>>  type: preonly
>>  maximum iterations=10000, initial guess is zero
>>  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>>  left preconditioning
>> PC Object:
>>  type: lu
>>    LU: out-of-place factorization
>>      matrix ordering: nd
>>    LU: tolerance for zero pivot 1e-12
>>    LU: factor fill ratio needed 0
>>         Factored matrix follows
>>        Matrix Object:
>>          type=umfpack, rows=118636, cols=118636
>>          total: nonzeros=0, allocated nonzeros=118636
>>            not using I-node routines
>>            UMFPACK run parameters:
>>              Control[UMFPACK_PRL]: 1
>>              Control[UMFPACK_STRATEGY]: 0
>>              Control[UMFPACK_DENSE_COL]: 0.2
>>              Control[UMFPACK_DENSE_ROW]: 0.2
>>              Control[UMFPACK_AMD_DENSE]: 10
>>              Control[UMFPACK_BLOCK_SIZE]: 32
>>              Control[UMFPACK_2BY2_TOLERANCE]: 0.01
>>              Control[UMFPACK_FIXQ]: 0
>>              Control[UMFPACK_AGGRESSIVE]: 1
>>              Control[UMFPACK_PIVOT_TOLERANCE]: 0.1
>>              Control[UMFPACK_SYM_PIVOT_TOLERANCE]: 0.001
>>              Control[UMFPACK_SCALE]: 1
>>              Control[UMFPACK_ALLOC_INIT]: 0.7
>>              Control[UMFPACK_DROPTOL]: 0
>>              Control[UMFPACK_IRSTEP]: 0
>>              UMFPACK default matrix ordering is used (not the PETSc matrix ordering)
>>  linear system matrix = precond matrix:
>>  Matrix Object:
>>    type=umfpack, rows=118636, cols=118636
>>    total: nonzeros=4377120, allocated nonzeros=29659000
>>      using I-node routines: found 105980 nodes, limit used is 5
>>
>> KR, Lars
>>
>> -----Original Message-----
>> From: owner-petsc-users at mcs.anl.gov
>> [mailto:owner-petsc-users at mcs.anl.gov] On behalf of Matthew Knepley
>> Sent: 2 June 2008 16:33
>> To: petsc-users at mcs.anl.gov
>> Subject: Re: Error codes from external packages
>>
>> On Mon, Jun 2, 2008 at 9:22 AM, Lars Rindorf <Lars.Rindorf at teknologisk.dk> wrote:
>>> Hi Matthew
>>>
>>> I have included the -mat_umfpack_prl parameter, but it does not make any difference. I have checked the spelling against the PETSc manual. When UMFPACK crashes, it (umfpack_di_numeric) returns an error code. I want to access that code.
>> Ah, I was checking solve(), not factor(). This is an oversight in the code. I am fixing it in the dev version. We will have a release fairly soon, but you can always get dev for these kinds of bug fixes quickly.
>>
>>   Matt
>>
>>> Here is what PETSc returns (with or without the -mat_umfpack_prl option):
>>> [lhr at localhost notch_patch]$ getdp patch_notch.pro -msh patch_notch_29000.msh -pre res_static -cal -ksp_type preonly -pc_type lu -mat_type umfpack -mat_umfpack_prl
>>> P r e - P r o c e s s i n g . . .
>>> E n d   P r e - P r o c e s s i n g
>>> P r o c e s s i n g . . .
>>> Operation : Generate[A]
>>> Info      : Setting System {A,b} to zero
>>> Resources : cpu 6.41402 s
>>> Operation : Solve[A]
>>> PETSc     : N: 136188
>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [0]PETSC ERROR: Error in external library!
>>> [0]PETSC ERROR: umfpack_di_numeric failed!
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 13, Thu May 15 17:29:26 CDT 2008 HG revision: 4466c6289a0922df26e20626fd4a0b4dd03c8124
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: patch_notch.pro on a linux-gnu named localhost.localdomain by lhr Mon Jun  2 16:03:50 2008
>>> [0]PETSC ERROR: Libraries linked from /home/lhr/Desktop/getdp/petsc/petsc-2.3.3-p13/lib/linux-gnu-c-debug
>>> [0]PETSC ERROR: Configure run at Mon Jun  2 15:58:42 2008
>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 --with-blas-lapack-dir=/opt/intel/mkl/10.0.1.014/lib/em64t/ --download-mpich=ifneeded --download-umfpack=ifneeded --with-shared=0
>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>> [0]PETSC ERROR: MatLUFactorNumeric_UMFPACK() line 129 in src/mat/impls/aij/seq/umfpack/umfpack.c
>>> [0]PETSC ERROR: MatLUFactorNumeric() line 2227 in src/mat/interface/matrix.c
>>> [0]PETSC ERROR: PCSetUp_LU() line 280 in src/ksp/pc/impls/factor/lu/lu.c
>>> [0]PETSC ERROR: PCSetUp() line 787 in src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: KSPSetUp() line 234 in src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: KSPSolve() line 347 in src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: User provided function() line 1472 in unknowndirectory/LinAlg_PETSC.c
>>> [unset]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
>>>
>>> KR, Lars
>>>
>>>
>>> -----Original Message-----
>>> From: owner-petsc-users at mcs.anl.gov
>>> [mailto:owner-petsc-users at mcs.anl.gov] On behalf of Matthew Knepley
>>> Sent: 2 June 2008 15:07
>>> To: petsc-users at mcs.anl.gov
>>> Subject: Re: Error codes from external packages
>>>
>>> On Mon, Jun 2, 2008 at 7:17 AM, Lars Rindorf <Lars.Rindorf at teknologisk.dk> wrote:
>>>> Dear all
>>>>
>>>> If I want to know the error codes of an external package that 
>>>> crashes, then what can I do?
>>> We call
>>>
>>>  umfpack_zl_report_status()
>>>
>>> with the UMFPACK status code in the event of a failure. Printing for this is controlled by the -mat_umfpack_prl option I believe.
>>>
>>>   Matt
>>>
>>>> The problem arises with UMFPACK when the size of the matrix is more
>>>> than 118000x118000, corresponding to 2.4 GB of memory consumption. It
>>>> simply reports that "umfpack_di_numeric" (factorization of a real matrix) failed.
>>>> Has anybody else experienced this?
>>>>
>>>> KR, Lars
>>>>
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>>
>>
> 
> 
> 



