[petsc-users] running error
paul zhang
paulhuaizhang at gmail.com
Mon Dec 1 15:08:19 CST 2014
Hi Jed,
Does this mean I've passed the default test? Is the "Unable to open
matplotlib" message an issue?
Thanks,
Paul
[hzh225 at dlxlogin2-2 petsc-3.5.2]$ make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel streams NPMAX=2
cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory streams
/home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/bin/mpicc -o MPIVersion.o -c -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -O0 -I/home/hzh225/LIB_CFD/nP/petsc-3.5.2/include -I/home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/include -I/share/cluster/RHEL6.2/x86_64/apps/valgrind/3.9.0/include `pwd`/MPIVersion.c
/home/hzh225/LIB_CFD/nP/petsc-3.5.2/src/benchmarks/streams/MPIVersion.c: In function ‘main’:
/home/hzh225/LIB_CFD/nP/petsc-3.5.2/src/benchmarks/streams/MPIVersion.c:99: warning: value computed is not used
/home/hzh225/LIB_CFD/nP/petsc-3.5.2/src/benchmarks/streams/MPIVersion.c:103: warning: value computed is not used
Number of MPI processes 1
Process 0 dlxlogin2-2.local
Function Rate (MB/s)
Copy: 10939.5817
Scale: 10774.4825
Add: 12114.9712
Triad: 11215.3413
Number of MPI processes 2
Process 0 dlxlogin2-2.local
Process 1 dlxlogin2-2.local
Function Rate (MB/s)
Copy: 20189.9660
Scale: 19714.0058
Add: 22403.2262
Triad: 21046.0602
------------------------------------------------
np speedup
1 1.0
2 1.88
Estimation of possible speedup of MPI programs based on Streams benchmark.
It appears you have 1 node(s)
Unable to open matplotlib to plot speedup
Unable to open matplotlib to plot speedup
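(For reference, the 1.88 in the table appears to be simply the ratio of the
two-process rate to the one-process rate; for Triad, 21046.0602 / 11215.3413
is about 1.88. The matplotlib message only means the optional speedup plot
was skipped because matplotlib could not be imported; the benchmark itself
ran and printed its numbers.)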
Huaibao (Paul) Zhang
*Gas Surface Interactions Lab*
Department of Mechanical Engineering
University of Kentucky,
Lexington,
KY, 40506-0503
*Office*: 216 Ralph G. Anderson Building
*Web*: gsil.engineering.uky.edu
On Sun, Nov 30, 2014 at 10:28 PM, Jed Brown <jed at jedbrown.org> wrote:
> Please keep the discussion on-list.
>
> paul zhang <paulhuaizhang at gmail.com> writes:
>
> > I set some breakpoints, which shows the code breaks down at the
> > PetscInitialize(&argc,&argv,(char *)0,help);
>
> Run in a debugger and send a stack trace. This is most likely due to a
> misconfigured environment/MPI.
>
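Following up on that suggestion: below is a minimal sketch of a reproducer
(not code from this thread; it assumes the PETSc 3.5 C API and uses a
hypothetical file name minimal_init.c) that does nothing but initialize and
finalize PETSc. Building it against the same PETSc installation and running
it under a debugger (e.g. "gdb ./minimal_init", then "run" and "bt") would
produce the stack trace Jed asks for.

  /* minimal_init.c: hypothetical minimal reproducer that only
     initializes and finalizes PETSc. */
  #include <petscsys.h>

  static char help[] = "Minimal PetscInitialize test.\n";

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;

    /* If the crash happens here, the gdb backtrace ("bt") should show
       where inside PetscInitialize/MPI_Init things go wrong. */
    ierr = PetscInitialize(&argc, &argv, (char *)0, help);
    if (ierr) return (int)ierr;
    ierr = PetscPrintf(PETSC_COMM_WORLD, "PetscInitialize succeeded\n");CHKERRQ(ierr);
    ierr = PetscFinalize();
    return (int)ierr;
  }

If even this small program aborts inside PetscInitialize, the backtrace will
usually point at the MPI library or the runtime environment, which is
consistent with Jed's guess about a misconfigured environment/MPI.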