On 25.04.2012 13:21, Jed Brown wrote:
> On Wed, Apr 25, 2012 at 02:21, Thomas Witkowski
> <thomas.witkowski@tu-dresden.de> wrote:
>
>> Is it possible to create a parallel matrix (MPIAIJ) with an empty
>> local size on some ranks? Even if this works, does it make sense
>> (with respect to runtime and efficiency) to collect all ranks that
>> have a nonempty contribution to the global matrix into an MPI group
>> and to use that communicator for matrix creation and computations?
>
> It depends on how many ranks are empty. For coarse levels of
> multigrid (where this situation arises frequently), it's sometimes
> worthwhile to also move the data to a part of the machine that is
> topologically "close". Identifying "closeness" is not always easy
> because MPI does not necessarily expose this information in a useful
> fashion.
>
> What is the circumstance where this occurs for you?
It is also a coarse-space method. When not all ranks contribute to the
coarse space, a matrix defined on that space has empty local parts on
those ranks, so the question is whether this causes any technical
trouble in PETSc. The fraction of ranks participating in the coarse
space may vary.
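
For concreteness, here is a rough sketch of the two variants I am
considering (untested; the toy distribution where only even ranks own
10 rows and the preallocation numbers are made up, and error checking
is omitted):

/* Sketch: an AIJ matrix where some ranks own zero rows, and
 * alternatively a subcommunicator holding only the contributing
 * ranks. Compile against PETSc, run with e.g. mpiexec -n 4. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank;
  PetscInt    nlocal;
  MPI_Comm    subcomm;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Hypothetical coarse-space distribution: odd ranks are empty. */
  nlocal = (rank % 2 == 0) ? 10 : 0;

  /* Option 1: stay on PETSC_COMM_WORLD. A local size of zero is
   * legal; the empty ranks simply own no rows, but they must still
   * take part in all collective calls (assembly, MatMult, ...). */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);
  MatSetType(A, MATAIJ);
  MatMPIAIJSetPreallocation(A, 5, NULL, 2, NULL);
  /* ... MatSetValues() on the owning ranks ... */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatDestroy(&A);

  /* Option 2: gather the contributing ranks in their own
   * communicator. Empty ranks pass MPI_UNDEFINED and receive
   * MPI_COMM_NULL, so they skip the coarse matrix entirely. */
  MPI_Comm_split(PETSC_COMM_WORLD, nlocal > 0 ? 0 : MPI_UNDEFINED,
                 rank, &subcomm);
  if (subcomm != MPI_COMM_NULL) {
    Mat B;
    MatCreate(subcomm, &B);
    MatSetSizes(B, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);
    MatSetType(B, MATAIJ);
    MatMPIAIJSetPreallocation(B, 5, NULL, 2, NULL);
    MatSeqAIJSetPreallocation(B, 5, NULL); /* if subcomm has size 1 */
    MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);
    MatDestroy(&B);
    MPI_Comm_free(&subcomm);
  }

  PetscFinalize();
  return 0;
}

With the first variant the empty ranks still have to participate in
every collective call on the matrix; with the second they drop out
completely, at the cost of managing an extra communicator.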

Thomas