<div class="gmail_extra">Wen :<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Thanks for all your replies.<br><br>Firstly, I repeat the ex2 for PETSc 3.2 with the options -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2 and it is running without any problem. <br>
<br>For my problem (a 3D FEM code), the matrix size is around 0.2 million and it is solved with preonly and LU. The code runs on 32 processors. I added -mat_mumps_icntl_4 1 and -info, but MUMPS did not output any information. The PETSc -info output gets stuck at "[0] VecScatterCreate(): Special case: processor zero gets entire parallel vector, rest get none". Some more information is attached at the end of this email. <br>
</blockquote><div><br></div><div>0.2 million unknowns on 32 processors, i.e., each processor holding approximately 6k equations, is a reasonable size for MUMPS,</div><div>unless the LU factorization causes a large amount of fill-in.</div><div>'-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2' activates parallel symbolic factorization, which is a relatively new feature</div><div>in MUMPS. Unless the symbolic factorization (called the analysis phase in MUMPS) takes a large portion of your run,</div><div>which I have never seen in my experiments, why not use the default options of the PETSc interface (i.e., sequential analysis)</div>
<div>and check the output of '-log_summary' to see whether you really need to optimize this phase through parallelization?</div>
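<div><br></div><div>For illustration, a default-analysis run with timing output might look like the following sketch (the ex2 tutorial is used here as a stand-in; substitute your own executable, options, and process count):</div><div><br></div><div>mpiexec -n 32 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -log_summary</div><div><br></div><div>The MatLUFactorSym and MatLUFactorNum entries in the -log_summary table, if present, show roughly how much time the analysis and the numeric factorization take.</div>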
<div><br></div><div>From the errors reported, the hang occurs inside MUMPS (apparently in MPI_ALLREDUCE()).</div><div>I suggest reporting the problem to the MUMPS developers. They are very supportive.</div><div><br></div><div>Hong</div><div><br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>I still cannot locate the problem. Any suggestions? Thanks.<br><br>Regards,<br>Wen<br><br>**************************************************************************************<br>[0] PCSetUp(): Setting up new PC<br>
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374777<br>[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374780<br>[...] (the same PetscCommDuplicate() message is repeated several times by each of the 32 ranks)<br>[0] VecScatterCreate(): Special case: processor zero gets entire parallel vector, rest get none<br>
**************************************************************************************************************<br><br><div class="gmail_quote">On Tue, Apr 24, 2012 at 3:41 PM, <span dir="ltr"><<a href="mailto:petsc-users-request@mcs.anl.gov" target="_blank">petsc-users-request@mcs.anl.gov</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Send petsc-users mailing list submissions to<br>
<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a><br>
<br>
To subscribe or unsubscribe via the World Wide Web, visit<br>
<a href="https://lists.mcs.anl.gov/mailman/listinfo/petsc-users" target="_blank">https://lists.mcs.anl.gov/mailman/listinfo/petsc-users</a><br>
or, via email, send a message with subject or body 'help' to<br>
<a href="mailto:petsc-users-request@mcs.anl.gov" target="_blank">petsc-users-request@mcs.anl.gov</a><br>
<br>
You can reach the person managing the list at<br>
<a href="mailto:petsc-users-owner@mcs.anl.gov" target="_blank">petsc-users-owner@mcs.anl.gov</a><br>
<br>
When replying, please edit your Subject line so it is more specific<br>
than "Re: Contents of petsc-users digest..."<br>
<br>
<br>
Today's Topics:<br>
<br>
1. mumps get stuck with parmetis (Wen Jiang)<br>
2. Re: mumps get stuck with parmetis (Hong Zhang)<br>
3. Re: mumps get stuck with parmetis ( Alexander Grayver )<br>
4. Re: mumps get stuck with parmetis (Aron Ahmadia)<br>
5. Re: mumps get stuck with parmetis ( Alexander Grayver )<br>
6. Re: mumps get stuck with parmetis (Aron Ahmadia)<br>
<br>
<br>
----------------------------------------------------------------------<br>
<br>
Message: 1<br>
Date: Tue, 24 Apr 2012 13:43:06 -0400<br>
From: Wen Jiang <<a href="mailto:jiangwen84@gmail.com" target="_blank">jiangwen84@gmail.com</a>><div class="im"><br>
Subject: [petsc-users] mumps get stuck with parmetis<br></div>
To: <a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a><br>
Message-ID:<br>
<<a href="mailto:CAMJxm%2BDa_byqMzm-_j%2BhRBUJSBt25oQAfDmUnVqHgBS4VPy8_A@mail.gmail.com" target="_blank">CAMJxm+Da_byqMzm-_j+hRBUJSBt25oQAfDmUnVqHgBS4VPy8_A@mail.gmail.com</a>><br>
Content-Type: text/plain; charset="iso-8859-1"<div class="im"><br>
<br>
Hi,<br>
<br>
My code will hang at the solving stage when I use mumps with the runtime<br>
option -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2. If I remove these two<br>
options, my code works fine. I am using PETSc 3.2 and configured it<br>
with --download-mumps=1<br>
and --download-parmetis=1. Could anyone give me any hints? Thanks.<br>
<br>
Regards,<br>
Wen<br></div>
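<br>
(For reference, a configure sketch matching the flags mentioned above; the exact set of options depends on the environment, and MUMPS typically also needs ScaLAPACK and BLACS:)<br>
<br>
./configure --download-mumps=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1<br>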
<br>
------------------------------<br>
<br>
Message: 2<br>
Date: Tue, 24 Apr 2012 13:29:32 -0500<br>
From: Hong Zhang <<a href="mailto:hzhang@mcs.anl.gov" target="_blank">hzhang@mcs.anl.gov</a>><br>
Subject: Re: [petsc-users] mumps get stuck with parmetis<div class="im"><br>
To: PETSc users list <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></div>
Message-ID:<br>
<<a href="mailto:CAGCphBurn0%2Bf--gOZ_4MSrLcntvYSVCQ2_cmSG93nvv3hHSEjA@mail.gmail.com" target="_blank">CAGCphBurn0+f--gOZ_4MSrLcntvYSVCQ2_cmSG93nvv3hHSEjA@mail.gmail.com</a>><br>
Content-Type: text/plain; charset="iso-8859-1"<div class="im"><br>
<br>
Wen :<br>
I cannot repeat your error with petsc-dev. Running<br>
petsc-dev/src/ksp/ksp/examples/tutorials/ex2.c:<br>
mpiexec -n 3 ./ex2 -pc_type lu -pc_factor_mat_solver_package mumps<br>
-mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2<br>
Norm of error 1.67272e-15 iterations 1<br>
<br>
Can you run the above with your petsc-3.2 installation?<br>
<br>
Hong<br>
<br></div>
<br>
------------------------------<br>
<br>
Message: 3<br>
Date: Tue, 24 Apr 2012 21:01:45 +0200<br>
From: " Alexander Grayver " <<a href="mailto:agrayver@gfz-potsdam.de" target="_blank">agrayver@gfz-potsdam.de</a>><br>
Subject: Re: [petsc-users] mumps get stuck with parmetis<div class="im"><br>
To: " PETSc users list " <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></div>
Message-ID: <<a href="mailto:auto-000008199816@cgp2.gfz-potsdam.de" target="_blank">auto-000008199816@cgp2.gfz-potsdam.de</a>><br>
Content-Type: text/plain; charset="utf-8"<div class="im"><br>
<br>
Can you set:<br>
<br>
-mat_mumps_icntl_4 1<br>
<br>
And send the MUMPS output?<br>
Also, do you use lu or ilu? How large is your matrix?<br>
<br>
Regards,<br>
Alexander<br>
<br>
</div>
<br>
------------------------------<br>
<br>
Message: 4<br>
Date: Tue, 24 Apr 2012 22:13:59 +0300<br>
From: Aron Ahmadia <<a href="mailto:aron.ahmadia@kaust.edu.sa" target="_blank">aron.ahmadia@kaust.edu.sa</a>><br>
Subject: Re: [petsc-users] mumps get stuck with parmetis<div class="im"><br>
To: PETSc users list <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></div>
Message-ID:<br>
<CADVoUZQDYqOpoZtxWY1DGE=<a href="mailto:WGgbZvxMVuRy8zRxYTtrSCMP3sg@mail.gmail.com" target="_blank">WGgbZvxMVuRy8zRxYTtrSCMP3sg@mail.gmail.com</a>><br>
Content-Type: text/plain; charset="iso-8859-1"<div><div class="h5"><br>
<br>
I'm not sure if this is related, but Parmetis+Mumps+PETSc 3.2 on BlueGene/P<br>
was causing similar behavior without even setting any options. The only<br>
way I was able to get a direct solver going was by switching over to<br>
SuperLU.<br>
<br>
A<br>
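<br>
(For reference, a sketch of switching the direct solver to SuperLU_DIST, the parallel variant, via the PETSc options database; this assumes PETSc was configured with --download-superlu_dist=1, and ./ex2 stands in for the actual executable:)<br>
<br>
mpiexec -n 32 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist<br>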
<br>
</div></div>
<br>
------------------------------<br>
<br>
Message: 5<br>
Date: Tue, 24 Apr 2012 21:22:51 +0200<br>
From: " Alexander Grayver " <<a href="mailto:agrayver@gfz-potsdam.de" target="_blank">agrayver@gfz-potsdam.de</a>><br>
Subject: Re: [petsc-users] mumps get stuck with parmetis<div class="im"><br>
To: " PETSc users list " <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></div>
Message-ID: <<a href="mailto:auto-000056660896@cgp1.gfz-potsdam.de" target="_blank">auto-000056660896@cgp1.gfz-potsdam.de</a>><br>
Content-Type: text/plain; charset="utf-8"<div class="im"><br>
<br>
Aron,<br>
<br>
This parameter lets you see the MUMPS output in the console. The important thing is to understand where MUMPS hangs: during analysis, factorization, or the actual solution (substitutions)? I'm almost sure it's the factorization step. I observe this pretty often with MUMPS compiled through PETSc (whereas when MUMPS is used directly it's quite rare to run into this problem).<br>
<br>
Regards,<br>
Alexander<br>
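<br>
(For reference, a sketch of a run with MUMPS diagnostics enabled; -mat_mumps_icntl_4 accepts values from 0 to 4 and, if I recall the MUMPS manual correctly, higher values print progressively more output; ./ex2 stands in for the actual executable:)<br>
<br>
mpiexec -n 32 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_4 2<br>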
<br>
</div>
<br>
------------------------------<br>
<br>
Message: 6<br>
Date: Tue, 24 Apr 2012 22:41:00 +0300<br>
From: Aron Ahmadia <<a href="mailto:aron.ahmadia@kaust.edu.sa" target="_blank">aron.ahmadia@kaust.edu.sa</a>><br>
Subject: Re: [petsc-users] mumps get stuck with parmetis<div class="im"><br>
To: PETSc users list <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br></div>
Message-ID:<br>
<<a href="mailto:CADVoUZSL%2BUXMJ0V52ELpE2vLLXc6WbZe96%2ByL5tYPsyMYiLXeQ@mail.gmail.com" target="_blank">CADVoUZSL+UXMJ0V52ELpE2vLLXc6WbZe96+yL5tYPsyMYiLXeQ@mail.gmail.com</a>><br>
Content-Type: text/plain; charset="iso-8859-1"<br>
<br>
/project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/mumps_part9.F:4666<br>
/project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part5.F:465<br>
/project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part1.F:409<br>
/project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/dmumps_part3.F:6651<br>
/project/k121/sandbox/petsc-dev/externalpackages/MUMPS_4.10.0-p3/src/mumps_c.c:422<div><div class="h5"><br>
<br>
I don't know the MUMPS source code very well, so I couldn't tell you what<br>
this set of routines is doing, but this is a snippet of the stack trace I<br>
was seeing when the jobs died on BG/P.<br>
<br>
If you set the "-info" flag on a PETSc run, it sends a lot of debugging<br>
output to the screen, which is useful when you're in a situation where it<br>
is hard to get access to a debugger or the stack trace.<br>
<br>
A<br>
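<br>
(For reference, a sketch of a run that combines the PETSc and MUMPS diagnostics mentioned in this thread, with the output redirected to a file; ./ex2 stands in for the actual executable:)<br>
<br>
mpiexec -n 32 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_4 1 -info > run.log 2>&1<br>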
<br>
</div></div>
<br>
------------------------------<br>
<br>
End of petsc-users Digest, Vol 40, Issue 76<br>
*******************************************<br>
</blockquote></div><br>
</blockquote></div><br></div>