[petsc-users] Installation question

Pham Pham pvsang002 at gmail.com
Wed Apr 19 11:34:56 CDT 2017


I reconfigured PETSc with the installed MPI; however, I got a serious error:

**************************ERROR*************************************
  Error during compile, check arch-linux-cxx-opt/lib/petsc/conf/make.log
  Send it and arch-linux-cxx-opt/lib/petsc/conf/configure.log to
petsc-maint at mcs.anl.gov
********************************************************************

Could you please explain what is happening?

Thank you very much.
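For reference, pointing configure at a pre-installed MPI is normally done through the MPI compiler wrappers rather than --download-mpich. A minimal sketch (the wrapper names and options below are placeholders for whatever the cluster actually provides):

  ./configure PETSC_ARCH=arch-linux-cxx-opt \
      --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
      --with-debugging=0
  make PETSC_DIR=$PWD PETSC_ARCH=arch-linux-cxx-opt all

The specific compile failure should be visible near the end of the arch-linux-cxx-opt/lib/petsc/conf/make.log file named in the error above.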




On Wed, Apr 19, 2017 at 11:43 PM, Satish Balay <balay at mcs.anl.gov> wrote:

> Presumably your cluster already has a recommended MPI to use (which is
> already installed), so you should use that instead of
> --download-mpich=1
>
> Satish
>
> On Wed, 19 Apr 2017, Pham Pham wrote:
>
> > Hi,
> >
> > I just installed petsc-3.7.5 on my university cluster. When evaluating
> > the computer system, PETSc reports "It appears you have 1 node(s)". I do
> > not understand this, since the system is a multi-node system. Could you
> > please explain this to me?
> >
> > Thank you very much.
> >
> > S.
> >
> > Output:
> > =========================================
> > Now to evaluate the computer systems you plan use - do:
> > make PETSC_DIR=/home/svu/mpepvs/petsc/petsc-3.7.5 PETSC_ARCH=arch-linux-cxx-opt streams
> > [mpepvs at atlas7-c10 petsc-3.7.5]$ make PETSC_DIR=/home/svu/mpepvs/petsc/petsc-3.7.5 PETSC_ARCH=arch-linux-cxx-opt streams
> > cd src/benchmarks/streams; /usr/bin/gmake  --no-print-directory PETSC_DIR=/home/svu/mpepvs/petsc/petsc-3.7.5 PETSC_ARCH=arch-linux-cxx-opt streams
> > /home/svu/mpepvs/petsc/petsc-3.7.5/arch-linux-cxx-opt/bin/mpicxx -o
> > MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing
> > -Wno-unknown-pragmas -fvisibility=hidden -g -O
> > -I/home/svu/mpepvs/petsc/petsc-3.7.5/include
> > -I/home/svu/mpepvs/petsc/petsc-3.7.5/arch-linux-cxx-opt/include
> > `pwd`/MPIVersion.c
> > Running streams with '/home/svu/mpepvs/petsc/petsc-3.7.5/arch-linux-cxx-opt/bin/mpiexec ' using 'NPMAX=12'
> > Number of MPI processes 1 Processor names  atlas7-c10
> > Triad:         9137.5025   Rate (MB/s)
> > Number of MPI processes 2 Processor names  atlas7-c10 atlas7-c10
> > Triad:         9707.2815   Rate (MB/s)
> > Number of MPI processes 3 Processor names  atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        13559.5275   Rate (MB/s)
> > Number of MPI processes 4 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        14193.0597   Rate (MB/s)
> > Number of MPI processes 5 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        14492.9234   Rate (MB/s)
> > Number of MPI processes 6 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15476.5912   Rate (MB/s)
> > Number of MPI processes 7 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15148.7388   Rate (MB/s)
> > Number of MPI processes 8 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15799.1290   Rate (MB/s)
> > Number of MPI processes 9 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15671.3104   Rate (MB/s)
> > Number of MPI processes 10 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15601.4754   Rate (MB/s)
> > Number of MPI processes 11 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15434.5790   Rate (MB/s)
> > Number of MPI processes 12 Processor names  atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10 atlas7-c10
> > Triad:        15134.1263   Rate (MB/s)
> > ------------------------------------------------
> > np  speedup
> > 1 1.0
> > 2 1.06
> > 3 1.48
> > 4 1.55
> > 5 1.59
> > 6 1.69
> > 7 1.66
> > 8 1.73
> > 9 1.72
> > 10 1.71
> > 11 1.69
> > 12 1.66
> > Estimation of possible speedup of MPI programs based on Streams benchmark.
> > It appears you have 1 node(s)
> > Unable to plot speedup to a file
> > Unable to open matplotlib to plot speedup
> > [mpepvs at atlas7-c10 petsc-3.7.5]$
> >
>
>
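A note on the streams output above: the benchmark was launched with the mpiexec from arch-linux-cxx-opt/bin, i.e. presumably the MPICH that --download-mpich=1 installed, and without a hostfile that launcher starts all ranks on the local host. That is why every process reports atlas7-c10 and PETSc concludes there is only 1 node. A rough sketch of the invocation, assuming the streams target honours the NPMAX variable shown in the output:

  make PETSC_DIR=/home/svu/mpepvs/petsc/petsc-3.7.5 PETSC_ARCH=arch-linux-cxx-opt NPMAX=12 streams

To measure bandwidth across several nodes, the build would first need to use the cluster's recommended MPI (as Satish suggests) and the benchmark would have to be launched through the cluster's own mpirun/scheduler rather than the bundled mpiexec.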
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log
Type: text/x-log
Size: 22382 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170420/4a44b2c1/attachment-0002.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: text/x-log
Size: 4405520 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170420/4a44b2c1/attachment-0003.bin>

