[petsc-users] MUMPS error and superLU error

venkatesh g venkateshgk.j at gmail.com
Fri May 29 10:21:56 CDT 2015


Thanks. So what other direct solver can I use for this singular B? QR seems
like a good option, but how do I go about it?

I will read the manual on dumping out the null space.
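
For reference, the relevant PETSc calls for attaching a null space to a
singular matrix are MatNullSpaceCreate() and MatSetNullSpace(). Below is a
minimal sketch; it assumes, purely for illustration, that the null space of
B is the constant vector, which would have to be replaced by the actual
null space of the problem.

#include <petscmat.h>

/* Sketch: attach a known null space to the singular matrix B so that a
   solver can project it out.  The constant-vector null space used here is
   an assumption for illustration only. */
PetscErrorCode AttachNullSpace(Mat B)
{
  MatNullSpace   nsp;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* PETSC_TRUE: the null space contains the constant vector;
     no additional basis vectors are supplied (0, NULL). */
  ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)B), PETSC_TRUE,
                            0, NULL, &nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(B, nsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}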

Venkatesh

On Fri, May 29, 2015 at 7:42 PM, Hong <hzhang at mcs.anl.gov> wrote:

> venkatesh:
>
>> On Tue, May 26, 2015 at 9:02 PM, Hong <hzhang at mcs.anl.gov> wrote:
>>
>>> 'A serial job in MATLAB for the same matrices takes < 60GB. '
>>> Can you run this case in serial? If so, try petsc, superlu or mumps to
>>> make sure the matrix is non-singular.
>>>
>> The B matrix is singular, but I get my result in PETSc and MUMPS for
>> small matrices.
>>
>
> You are lucky to get something out of a singular matrix using LU
> factorization, likely due to arithmetic roundoff. Is the obtained solution
> useful?
>
> I suggest reading the petsc or slepc manual on how to dump out the null
> space when a matrix is singular.
>
> Hong
>
>>
>>> Both mumps and superlu_dist crash in MatFactorNumeric(). Mumps gives the
>>> error:
>>> [16]PETSC ERROR: Error reported by MUMPS in numerical factorization
>>> phase: Cannot allocate required memory 65710 megabytes.
>>>
>>> Does your code work for smaller problems?
>>>
>> Yes, the code works for small problems.
>>
>>> Try using more processors?
>>>
>>> Why use such a huge '-mat_mumps_icntl_14 200000' (percentage of estimated
>>> workspace increase)? The default is 20. Try 40?
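>>>
>>> For example, reusing the ex7 command quoted below with a moderate
>>> increase would look roughly like this (a sketch, with the full mpiexec
>>> path and host list omitted):
>>>
>>>   mpiexec -np 64 ./ex7 -f1 a72t -f2 b72t -st_type sinvert -eps_nev 3 \
>>>     -eps_target 0.5 -st_ksp_type preonly -st_pc_type lu \
>>>     -st_pc_factor_mat_solver_package mumps -mat_mumps_icntl_14 40
>>>
>>> (The same control can also be set in code with MatMumpsSetIcntl(F, 14, 40)
>>> on the factored matrix obtained from PCFactorGetMatrix().)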
>>>
>>> Superlu_dist usually uses less memory than mumps, but it also crashes. I
>>> guess something is wrong with your matrix. Is it singular?
>>>
>> The B matrix is singular. So SuperLU cannot be used, is that right?
>>
>>
>>> Run superlu_dist on a slightly smaller matrix with option
>>> '-mat_superlu_dist_statprint', which displays memory usage info, e.g.,
>>>
>> OK, I will do that and check.
>>
>>> petsc/src/ksp/ksp/examples/tutorials (maint)
>>> $ mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package
>>> superlu_dist -mat_superlu_dist_statprint
>>> Nonzeros in L       560
>>> Nonzeros in U       560
>>> nonzeros in L+U     1064
>>> nonzeros in LSUB    248
>>> NUMfact space (MB) sum(procs):  L\U 0.01 all 0.05
>>> Total highmark (MB):  All 0.05 Avg 0.02 Max 0.02
>>> EQUIL time             0.00
>>> ROWPERM time           0.00
>>> COLPERM time           0.00
>>> SYMBFACT time          0.00
>>> DISTRIBUTE time        0.00
>>> FACTOR time            0.00
>>> Factor flops 1.181000e+04 Mflops     4.80
>>> SOLVE time             0.00
>>> SOLVE time             0.00
>>> Solve flops 2.194000e+03 Mflops     4.43
>>> SOLVE time             0.00
>>> Solve flops 2.194000e+03 Mflops     5.14
>>> Norm of error 1.18018e-15 iterations 1
>>>
>>> Hong
>>>
>>>
>>> On Tue, May 26, 2015 at 9:03 AM, venkatesh g <venkateshgk.j at gmail.com>
>>> wrote:
>>>
>>>> I posted a while ago on the MUMPS forums, but no one seems to reply.
>>>>
>>>> I am solving a large generalized eigenvalue problem.
>>>>
>>>> I am getting the following error (attached) after giving the command:
>>>>
>>>> /cluster/share/venkatesh/petsc-3.5.3/linux-gnu/bin/mpiexec -np 64
>>>> -hosts compute-0-4,compute-0-6,compute-0-7,compute-0-8 ./ex7 -f1 a72t -f2
>>>> b72t -st_type sinvert -eps_nev 3 -eps_target 0.5 -st_ksp_type preonly
>>>> -st_pc_type lu -st_pc_factor_mat_solver_package mumps -mat_mumps_icntl_14
>>>> 200000
>>>>
>>>> It is impossible to allocate so much memory per processor; it is asking
>>>> for around 70 GB per processor.
>>>>
>>>> A serial job in MATLAB for the same matrices takes < 60GB.
>>>>
>>>> After trying out SuperLU_dist, I have also attached the error from that
>>>> run (a segmentation fault).
>>>>
>>>> Kindly help me.
>>>>
>>>> Venkatesh
>>>>
>>>>
>>>>
>>>
>>
>