[petsc-users] mumps get stuck with parmetis

Wen Jiang jiangwen84 at gmail.com
Tue Apr 24 16:01:41 CDT 2012


Thanks for all your replies.

First, I repeated the ex2 test with my PETSc 3.2 installation using the options
-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2, and it runs without any
problem.

For my own problem (a 3D FEM code), the matrix size is around 0.2 million and
it is solved with a preonly KSP and LU factorization. The code runs on 32
processors. I added -mat_mumps_icntl_4 1 and -info; however, MUMPS did not
print any output. The PETSc -info output gets stuck at "[0]
VecScatterCreate(): Special case: processor zero gets entire parallel
vector, rest get none". More of the -info output is attached at the end of
this email.
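
To be concrete, the solver is set up essentially as in the minimal sketch
below (function and variable names are placeholders, not my actual
application code; A, b, x stand for the assembled FEM system), with the
MUMPS ICNTL settings passed as runtime options exactly as above:

/* Minimal sketch (placeholder names, not the actual application code) of the
 * preonly + LU + MUMPS setup described above, written against the PETSc 3.2 API.
 * Run with, e.g.:
 *   mpiexec -n 32 ./app -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2 \
 *                       -mat_mumps_icntl_4 1 -info
 */
#include <petscksp.h>

PetscErrorCode SolveWithMumps(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);   /* no Krylov iterations, direct solve only */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);           /* LU factorization */
  ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERMUMPS);CHKERRQ(ierr); /* same as -pc_factor_mat_solver_package mumps */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  /* The -mat_mumps_icntl_* options are read from the options database when the
     MUMPS factorization is set up during the first KSPSolve/PCSetUp. */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}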

I still cannot locate the problem. Any suggestions? Thanks.

Regards,
Wen

**************************************************************************************
[0] PCSetUp(): Setting up new PC
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[6] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[7] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[4] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[8] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[5] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[9] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[10] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[16] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[11] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[11] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[26] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[26] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[18] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[18] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[19] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[19] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[31] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[6] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[7] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[4] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[14] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[14] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[8] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[5] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[9] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[23] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[23] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[15] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[15] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[10] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[24] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[24] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[16] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[25] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[25] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[17] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[17] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[27] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[27] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[12] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[12] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[28] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[28] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[20] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[20] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[29] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[29] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[21] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[21] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[30] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[30] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[22] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[22] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[31] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[5] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[5] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[4] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[4] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[3] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[2] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[31] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[31] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[10] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[11] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[11] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[10] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[7] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[7] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[25] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[25] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[24] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[29] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[24] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[29] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[15] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[15] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[16] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[18] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[16] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[18] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[30] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[30] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[6] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[6] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[8] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[8] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[28] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[28] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[26] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[13] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[26] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[9] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[9] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[12] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[12] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[27] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[27] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[21] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[20] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[21] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[23] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[20] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[23] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[19] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[19] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[17] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[17] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[22] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[22] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374780
[14] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[14] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
-2080374777
[0] VecScatterCreate(): Special case: processor zero gets entire parallel
vector, rest get none
**************************************************************************************************************

On Tue, Apr 24, 2012 at 3:41 PM, <petsc-users-request at mcs.anl.gov> wrote:

> Message: 1
> Date: Tue, 24 Apr 2012 13:43:06 -0400
> From: Wen Jiang <jiangwen84 at gmail.com>
> Subject: [petsc-users] mumps get stuck with parmetis
> To: petsc-users at mcs.anl.gov
>
> Hi,
>
> My code will hang at the solving stage when I use mumps with the runtime
> option -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2. If I remove these two
> options, my code works fine. I am using PETSc 3.2, configured with
> --download-mumps=1 and --download-parmetis=1. Could anyone give me any
> hints? Thanks.
>
> Regards,
> Wen
>
> ------------------------------
>
> Message: 2
> Date: Tue, 24 Apr 2012 13:29:32 -0500
> From: Hong Zhang <hzhang at mcs.anl.gov>
> Subject: Re: [petsc-users] mumps get stuck with parmetis
> To: PETSc users list <petsc-users at mcs.anl.gov>
>
> Wen:
> I cannot repeat your error with petsc-dev. Running
> petsc-dev/src/ksp/ksp/examples/tutorials/ex2.c:
> mpiexec -n 3 ./ex2 -pc_type lu -pc_factor_mat_solver_package mumps
> -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
> Norm of error 1.67272e-15 iterations 1
>
> Can you run the above with your petsc-3.2 installation?
>
> Hong
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 24 Apr 2012 21:01:45 +0200
> From: " Alexander Grayver " <agrayver at gfz-potsdam.de>
> Subject: Re: [petsc-users] mumps get stuck with parmetis
> To: " PETSc users list " <petsc-users at mcs.anl.gov>
>
> Can you set:
>
> -mat_mumps_icntl_4 1
>
> and send the MUMPS output?
> Also, do you use LU or ILU? How large is your matrix?
>
> Regards,
> Alexander
>
>
> ------------------------------
>
> Message: 4
> Date: Tue, 24 Apr 2012 22:13:59 +0300
> From: Aron Ahmadia <aron.ahmadia at kaust.edu.sa>
> Subject: Re: [petsc-users] mumps get stuck with parmetis
> To: PETSc users list <petsc-users at mcs.anl.gov>
>
> I'm not sure if this is related, but Parmetis+Mumps+PETSc 3.2 on BlueGene/P
> was causing similar behavior without even setting any options.  The only
> way I was able to get a direct solver going was by switching over to
> SuperLU.
>
> A
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 24 Apr 2012 21:22:51 +0200
> From: " Alexander Grayver " <agrayver at gfz-potsdam.de>
> Subject: Re: [petsc-users] mumps get stuck with parmetis
> To: " PETSc users list " <petsc-users at mcs.anl.gov>
>
> Aron,
>
> This parameter lets you see the MUMPS output in the console. The important
> thing is to understand where MUMPS hangs: during analysis, factorization, or
> the actual solution (substitutions). I'm almost sure it's the factorization
> step. I observe this pretty often with MUMPS compiled through PETSc (whereas
> when MUMPS is used directly it's quite rare to run into this problem).
>
> Regards,
> Alexander
>
>
> ------------------------------
>
> Message: 6
> Date: Tue, 24 Apr 2012 22:41:00 +0300
> From: Aron Ahmadia <aron.ahmadia at kaust.edu.sa>
> Subject: Re: [petsc-users] mumps get stuck with parmetis
> To: PETSc users list <petsc-users at mcs.anl.gov>
>
> /project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/mumps_part9.F:4666
> /project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part5.F:465
> /project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part1.F:409
> /project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part3.F:6651
> /project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/mumps_c.c:422
>
> I don't know the MUMPS source code very well, so I couldn't tell you what
> this set of routines is doing, but this is a snippet of the stack trace I
> was seeing when the jobs died on BG/P.
>
> If you set the "-info" flag on a PETSc run, it sends a lot of debugging
> output to the screen, which is useful when you're in a situation where it
> is hard to get access to a debugger or the stack trace.
>
> A
>

