<html dir="ltr">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<style id="owaParaStyle" type="text/css">P {margin-top:0;margin-bottom:0;}</style>
</head>
<body ocsi="0" fpstyle="1">
<div style="direction: ltr;font-family: Tahoma;color: #000000;font-size: 10pt;">Hi, Matt<br>
<br>
Thank you. I see the point :-)<br>
<br>
Cheers<br>
<br>
Gao<br>
<div style="font-family: Times New Roman; color: #000000; font-size: 16px">
<hr tabindex="-1">
<div style="direction: ltr;" id="divRpF32683"><font color="#000000" face="Tahoma" size="2"><b>From:</b> petsc-users-bounces@mcs.anl.gov [petsc-users-bounces@mcs.anl.gov] on behalf of Matthew Knepley [knepley@gmail.com]<br>
<b>Sent:</b> Thursday, March 22, 2012 9:38 PM<br>
<b>To:</b> PETSc users list<br>
<b>Subject:</b> Re: [petsc-users] storage of parallel dense matrices and (anti)symmetric matrices<br>
</font><br>
</div>
<div></div>
<div>On Thu, Mar 22, 2012 at 3:32 PM, Gao Bin <span dir="ltr"><<a href="mailto:bin.gao@uit.no" target="_blank">bin.gao@uit.no</a>></span> wrote:<br>
<div class="gmail_quote">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
<div>
<div style="direction:ltr; font-size:10pt; font-family:Tahoma">Hi, Jed<br>
<br>
Thank you very much for your quick reply. May I ask two further questions?<br>
<br>
(1) Why doesn't PETSc also partition the columns, so that each processor could use less memory?<br>
</div>
</div>
</blockquote>
<div><br>
</div>
<div>2D distributions are not efficient for sparse matrices. They are sometimes used for dense matrices.</div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
<div>
<div style="direction:ltr; font-size:10pt; font-family:Tahoma">(2) If my matrix is square, the number of local columns "n" should equal the number of local rows "m" when calling MatCreateMPIDense, am I right?<br>
</div>
</div>
</blockquote>
<div><br>
</div>
<div>Yes. You can always let PETSc choose by giving PETSC_DETERMINE.</div>
<div><br>
</div>
<div> Matt</div>
<div> </div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
<div>
<div style="direction:ltr; font-size:10pt; font-family:Tahoma">Thank you again for your answer.<br>
<br>
Cheers<br>
<br>
Gao<br>
<div style="font-size:16px; font-family:Times New Roman">
<hr>
<div style="direction:ltr"><font color="#000000" face="Tahoma"><b>From:</b> <a href="mailto:petsc-users-bounces@mcs.anl.gov" target="_blank">
petsc-users-bounces@mcs.anl.gov</a> [<a href="mailto:petsc-users-bounces@mcs.anl.gov" target="_blank">petsc-users-bounces@mcs.anl.gov</a>] on behalf of Jed Brown [<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>]<br>
<b>Sent:</b> Thursday, March 22, 2012 9:17 PM<br>
<b>To:</b> PETSc users list<br>
<b>Subject:</b> Re: [petsc-users] storage of parallel dense matrices and (anti)symmetric matrices<br>
</font><br>
</div>
<div></div>
<div>
<div class="gmail_quote">2012/3/22 Gao Bin <span dir="ltr"><<a href="mailto:bin.gao@uit.no" target="_blank">bin.gao@uit.no</a>></span><br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
<font>"The parallel dense matrices are partitioned by rows across the processors, so that each local rectangular submatrix is stored in the dense format described above."</font>
<font><br>
<br>
Does it mean each processor will have several contiguous rows and all columns of the matrix? If so, why do we need to specify "n" -- the number of local columns -- when calling MatCreateMPIDense?<br>
</font></blockquote>
<div><br>
</div>
<div>Interpret the local column size n as the local size of the Vec that the Mat will be applied to.</div>
<div><br>
</div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
<font><br>
I am sorry to raise this simple question; I have read the manual and the tutorials, but I have not found a clear answer. Moreover, the reason I ask is that I would like to use PETSc for matrix operations, but the elements of the matrices need
to be calculated by my own code. If I know the distribution of the matrix, I can let each processor calculate and set only its local values (the rows and columns owned by that processor) for efficiency.<br>
<br>
My second question is whether PETSc provides symmetric and anti-symmetric matrices. I have read the manual, and the answer seems to be no. Am I right?</font></blockquote>
</div>
<br>
<div>See the SBAIJ format (it is sparse).</div>
<div><br>
</div>
<div>With a parallel dense matrix, there isn't any point using a symmetric format unless you use a different distribution of the entries.</div>
</div>
</div>
</div>
</div>
</blockquote>
</div>
<br>
<br clear="all">
<div><br>
</div>
-- <br>
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>
</div>
</div>
</div>
</body>
</html>