On Tue, Feb 21, 2012 at 10:31 AM, Gerard Gorman <span dir="ltr"><<a href="mailto:g.gorman@imperial.ac.uk">g.gorman@imperial.ac.uk</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Gerard Gorman emailed the following on 21/02/12 16:22:<br>
> Hi<br>
><br>
> I would like to harvest a selection of typical use cases for<br>
> benchmarking PETSc/OpenMP. Ideally they would have features such as<br>
> preallocating matrices rather than adding entries one at a time, and be<br>
> easy to configure for different problem sizes, etc. Has anyone already<br>
> put together such a list - what would be considered good/best practice here?<br>
><br>
> Cheers<br>
> Gerard<br>
><br>
Further to this - do I recall someone saying they had unpushed bug fixes<br>
for the test cases? For example, I've tried:<br>
<br>
ggorman@cynic:~/projects/petsc/petsc-dev/src/ksp/ksp/examples/tests$<br>
./ex10 -m 4<br>
m = 2, N=375<br>
[0]PETSC ERROR: --------------------- Error Message<br>
------------------------------------<br>
[0]PETSC ERROR: Argument out of range!<br>
[0]PETSC ERROR: New nonzero at (6,119) caused a malloc!<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Development HG revision:<br>
c7afeaa80089c71502abd0b0aca07e7ea2f3d431 HG Date: Tue Feb 21 09:48:53<br>
2012 -0600<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: ./ex10 on a arch-linu named cynic by ggorman Tue Feb 21<br>
16:28:33 2012<br>
[0]PETSC ERROR: Libraries linked from<br>
/home/ggorman/projects/petsc/petsc-dev/arch-linux2-c-debug/lib<br>
[0]PETSC ERROR: Configure run at Tue Feb 21 16:23:30 2012<br>
[0]PETSC ERROR: Configure options<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: MatSetValues_SeqAIJ() line 331 in<br>
/home/ggorman/projects/petsc/petsc-dev/src/mat/impls/aij/seq/aij.c<br>
[0]PETSC ERROR: MatSetValues() line 1142 in<br>
/home/ggorman/projects/petsc/petsc-dev/src/mat/interface/matrix.c<br>
[0]PETSC ERROR: AddElement() line 198 in src/ksp/ksp/examples/tests/ex10.c<br>
[0]PETSC ERROR: GetElasticityMatrix() line 130 in<br>
src/ksp/ksp/examples/tests/ex10.c<br>
[0]PETSC ERROR: main() line 41 in src/ksp/ksp/examples/tests/ex10.c<br>
--------------------------------------------------------------------------<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD<br>
with errorcode 63.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
<br>
Pretty much the only value for the problem size (i.e. the -m option) that works is<br>
the default.<br></blockquote><div><br></div><div>Yes, the preallocation is not done right. We obviously do not use this one.</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Cheers<br>
<span class="HOEnZb"><font color="#888888">Gerard<br>
<br>
</font></span></blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>