<html><body><div style="color:#000; background-color:#fff; font-family:times new roman, new york, times, serif;font-size:12pt"><div><span>I think the sentence:</span></div><div><br><span></span></div><div>The matrix has row and column ownership ranges.</div><div><br></div><div>is very important. I did not get it from the manual (maybe my fault).</div><div><br></div><div>Kind regards,</div><div>Pham Van Ha<br></div><div><br></div> <div style="font-family: times new roman, new york, times, serif; font-size: 12pt;"> <div style="font-family: times new roman, new york, times, serif; font-size: 12pt;"> <div dir="ltr"> <font face="Arial" size="2"> <hr size="1"> <b><span style="font-weight:bold;">From:</span></b> "petsc-users-request@mcs.anl.gov" <petsc-users-request@mcs.anl.gov><br> <b><span style="font-weight: bold;">To:</span></b> petsc-users@mcs.anl.gov <br> <b><span style="font-weight: bold;">Sent:</span></b> Sunday, April 29, 2012 12:00 AM<br>
<b><span style="font-weight: bold;">Subject:</span></b> petsc-users Digest, Vol 40, Issue 107<br> </font> </div> <br>
Today's
Topics:<br><br> 1. Re: Preallocation for rectangular matrix (Jed Brown)<br> 2. Re: mumps solve with same nonzero pattern (Alexander Grayver)<br> 3. superlu dist colperm by default (Wen Jiang)<br><br><br>----------------------------------------------------------------------<br><br>Message: 1<br>Date: Sat, 28 Apr 2012 05:05:22 -0500<br>From: Jed Brown <<a ymailto="mailto:jedbrown@mcs.anl.gov" href="mailto:jedbrown@mcs.anl.gov">jedbrown@mcs.anl.gov</a>><br>Subject: Re: [petsc-users] Preallocation for rectangular matrix<br>To: Pham Van <<a ymailto="mailto:pham_ha@yahoo.com" href="mailto:pham_ha@yahoo.com">pham_ha@yahoo.com</a>>, PETSc users list<br> <<a
ymailto="mailto:CAM9tzSnXqs5Uzx1z9Lu2BMGFrRe3PsjQpbJaMcdeDPZmQDb7vQ@mail.gmail.com" href="mailto:CAM9tzSnXqs5Uzx1z9Lu2BMGFrRe3PsjQpbJaMcdeDPZmQDb7vQ@mail.gmail.com">CAM9tzSnXqs5Uzx1z9Lu2BMGFrRe3PsjQpbJaMcdeDPZmQDb7vQ@mail.gmail.com</a>><br>Content-Type: text/plain; charset="utf-8"<br><br>On Sat, Apr 28, 2012 at 03:32, Pham Van <<a ymailto="mailto:pham_ha@yahoo.com" href="mailto:pham_ha@yahoo.com">pham_ha@yahoo.com</a>> wrote:<br><br>> I am trying to create a rectangular matrix whose number of<br>> rows is much larger than its number of columns.<br>><br>> I create two matrices together, one square and one rectangular, with the same<br>> number of rows. To my surprise, creating the smaller (rectangular) matrix<br>> takes much more time than the bigger (square) one. I did preallocate<br>> both matrices with a predefined number of diagonal and off-diagonal<br>> entries, but I did not know how to define the
diagonal part of a<br>> rectangular matrix. Only a very small top part of the matrix is "diagonal".<br>><br>> I have tried both methods: first, setting the diagonal and off-diagonal<br>> parts as if it were a square matrix; second, setting only a small top part of<br>> the matrix as diagonal. Neither method works.<br>><br>> Does anyone know how to preallocate a rectangular matrix?<br>><br>> By the way, the same code runs pretty fast with a single process<br>> and painfully slowly when two processes are employed.<br>><br><br>The matrix has row and column ownership ranges. The "diagonal" part is any<br>entry that lies in both the row and column ownership range. The<br>"off-diagonal" part is in the row ownership range, but not in the column<br>ownership range.<br><br>Run with -mat_new_nonzero_allocation_err or<br><br>ierr = MatSetOption(B,MAT_NEW_NONZERO_ALLOCATION_ERR,flg);CHKERRQ(ierr);<br><br>to find which entries start
going outside your preallocation.<br><br>------------------------------<br><br>Message: 2<br>Date: Sat, 28 Apr 2012 16:03:58 +0200<br>From: Alexander Grayver <<a ymailto="mailto:agrayver@gfz-potsdam.de" href="mailto:agrayver@gfz-potsdam.de">agrayver@gfz-potsdam.de</a>><br>Subject: Re: [petsc-users] mumps solve with same nonzero pattern<br>To: PETSc users list <<a ymailto="mailto:petsc-users@mcs.anl.gov" href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>><br>Message-ID: <<a ymailto="mailto:4F9BF8CE.7030005@gfz-potsdam.de" href="mailto:4F9BF8CE.7030005@gfz-potsdam.de">4F9BF8CE.7030005@gfz-potsdam.de</a>><br>Content-Type: text/plain; charset=ISO-8859-1; format=flowed<br><br>This is not quite normal.<br><br>Right now I am dealing with a 0.65 million problem, and symbolic <br>factorization takes 15 seconds using 64 cores (four 16-core nodes).<br><br>As Hong suggested, you should try different ordering techniques. Usually <br>metis is among the best (-mat_mumps_icntl_7 5). Also, any timing is <br>better done with a non-debug build (--with-debugging=0 flag for configure).<br><br>You should be able to get some output and statistics from MUMPS with this:<br>-mat_mumps_icntl_3 2 -mat_mumps_icntl_4 2<br><br>If you are not familiar with these options, see the MUMPS manual. Without <br>any output from MUMPS it is not really clear what happens.<br><br>On 27.04.2012 19:49, Wen Jiang wrote:<br>> Thanks for your reply. I also tested exactly the same problem with <br>> SuperLU. MUMPS and SuperLU_DIST both use sequential symbolic <br>> factorization. However, SuperLU_DIST takes only 20 seconds while MUMPS <br>> takes almost 700 seconds. I am wondering whether such a big difference <br>>
is possible. Do those two direct solvers use quite different algorithms?<br>><br>> Also, since I may have to solve systems with the same nonzero structure <br>> many times in different places, I am wondering whether I could save the <br>> symbolic factorization output somewhere and then read it back in as the <br>> input for future solves. Thanks.<br>><br>> Regards,<br>> Wen<br>><br><br><br>-- <br>Regards,<br>Alexander<br><br><br><br>------------------------------<br><br>Message: 3<br>Date: Sat, 28 Apr 2012 13:00:06 -0400<br>From: Wen Jiang <<a ymailto="mailto:jiangwen84@gmail.com" href="mailto:jiangwen84@gmail.com">jiangwen84@gmail.com</a>><br>Subject: [petsc-users] superlu dist colperm by default<br>To: <a ymailto="mailto:petsc-users@mcs.anl.gov" href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a><br>Message-ID:<br> <CAMJxm+<a
ymailto="mailto:DQwYW-6prQHAXPCuC3buvkq7md5yVaHqTMtprBjEjDsA@mail.gmail.com" href="mailto:DQwYW-6prQHAXPCuC3buvkq7md5yVaHqTMtprBjEjDsA@mail.gmail.com">DQwYW-6prQHAXPCuC3buvkq7md5yVaHqTMtprBjEjDsA@mail.gmail.com</a>><br>Content-Type: text/plain; charset="iso-8859-1"<br><br>Hi,<br><br>I am using SuperLU_DIST in PETSc 3.1. The version of SuperLU_DIST installed<br>by PETSc is SuperLU_DIST_2.4-hg-v2. Does anyone know which colperm method<br>SuperLU_DIST uses by default under PETSc? I could not get that<br>information using -mat_superlu_dist_statprint, the output of which I<br>attached below. Thanks.<br><br>Regards,<br>Wen<br><br>***************************************************************************************************************<br>PC Object:<br> type: lu<br> LU: out-of-place factorization<br> tolerance for zero pivot 1e-12<br> matrix ordering: natural<br> factor fill ratio given
0, needed 0<br> Factored matrix follows:<br> Matrix Object:<br> type=mpiaij, rows=215883, cols=215883<br> package used to perform factorization: superlu_dist<br> total: nonzeros=0, allocated nonzeros=431766<br> SuperLU_DIST run parameters:<br> Process grid nprow 8 x npcol 8<br> Equilibrate matrix TRUE<br> Matrix input mode 1<br> Replace tiny pivots TRUE<br> Use iterative refinement FALSE<br> Processors in row 8 col partition 8<br> Row
permutation LargeDiag<br> Parallel symbolic factorization FALSE<br> Repeated factorization SamePattern<br>******************************************************************************************************************<br><br>------------------------------<br><br>_______________________________________________<br>petsc-users mailing list<br><a ymailto="mailto:petsc-users@mcs.anl.gov" href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a><br><a href="https://lists.mcs.anl.gov/mailman/listinfo/petsc-users"
target="_blank">https://lists.mcs.anl.gov/mailman/listinfo/petsc-users</a><br><br><br>End of petsc-users Digest, Vol 40, Issue 107<br>********************************************<br><br><br> </div> </div> </div></body></html>