[petsc-users] Petsc/3.5.0 make test failed
Barry Smith
bsmith at mcs.anl.gov
Fri Jul 25 15:50:18 CDT 2014
Well, it is running. It is just producing annoying warning messages. You need to talk to your local MPI expert on that system about how to get rid of them.
Barry
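If the goal is just to silence this particular warning, one common approach (an assumption about this cluster's setup, so it should be confirmed with the local MPI admin Barry mentions) is to exclude the openib BTL so Open MPI stops probing the inactive InfiniBand ports:

```shell
# Exclude the openib BTL for all subsequent Open MPI runs in this
# shell; the remaining transports (shared memory, TCP) are used instead.
export OMPI_MCA_btl=^openib
# One-off equivalent on the command line:
#   mpiexec --mca btl ^openib -n 2 ./ex19
echo "OMPI_MCA_btl=$OMPI_MCA_btl"
```

This only suppresses the transport probe; if the cluster is actually supposed to use InfiniBand, the inactive-port condition itself still needs fixing.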
On Jul 25, 2014, at 3:23 PM, Kai Song <ksong at lbl.gov> wrote:
> Hi Barry,
>
> I did indeed delete some of the OpenMPI-related warning messages; they should be irrelevant. Here is the full output:
> ======================
> [kaisong at n0009 petsc-3.5.0]$ sh petsc-3.5.0-gcc.sh
> Running test examples to verify correct installation
> Using PETSC_DIR=/clusterfs/voltaire/home/software/modules/petsc/3.5.0 and PETSC_ARCH=arch-linux2-c-debug
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> --------------------------------------------------------------------------
> WARNING: There is at least non-excluded one OpenFabrics device found,
> but there are no active ports detected (or Open MPI was unable to use
> them). This is most certainly not what you wanted. Check your
> cables, subnet manager configuration, etc. The openib BTL will be
> ignored for this job.
>
> Local host: n0009.scs00
> --------------------------------------------------------------------------
> lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> Number of SNES iterations = 2
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> --------------------------------------------------------------------------
> WARNING: There is at least non-excluded one OpenFabrics device found,
> but there are no active ports detected (or Open MPI was unable to use
> them). This is most certainly not what you wanted. Check your
> cables, subnet manager configuration, etc. The openib BTL will be
> ignored for this job.
>
> Local host: n0009.scs00
> --------------------------------------------------------------------------
> lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> Number of SNES iterations = 2
> [n0009.scs00:07638] 1 more process has sent help message help-mpi-btl-openib.txt / no active ports found
> [n0009.scs00:07638] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> --------------------------------------------------------------------------
> WARNING: There is at least non-excluded one OpenFabrics device found,
> but there are no active ports detected (or Open MPI was unable to use
> them). This is most certainly not what you wanted. Check your
> cables, subnet manager configuration, etc. The openib BTL will be
> ignored for this job.
>
> Local host: n0009.scs00
> --------------------------------------------------------------------------
> Number of SNES iterations = 4
> Completed test examples
> =========================================
> Now to evaluate the computer systems you plan use - do:
> make PETSC_DIR=/clusterfs/voltaire/home/software/modules/petsc/3.5.0 PETSC_ARCH=arch-linux2-c-debug streams NPMAX=<number of MPI processes you intend to use>
> ======================
>
> I also followed your suggestion and built ex19 separately, and made sure to use the same compiler that built PETSc. Here is the complete output:
> ======================
> [kaisong at n0009 tutorials]$ mpiexec -n 2 ./ex19
> --------------------------------------------------------------------------
> WARNING: There is at least non-excluded one OpenFabrics device found,
> but there are no active ports detected (or Open MPI was unable to use
> them). This is most certainly not what you wanted. Check your
> cables, subnet manager configuration, etc. The openib BTL will be
> ignored for this job.
>
> Local host: n0009.scs00
> --------------------------------------------------------------------------
> lid velocity = 0.0625, prandtl # = 1, grashof # = 1
> Number of SNES iterations = 2
> [n0009.scs00:09124] 1 more process has sent help message help-mpi-btl-openib.txt / no active ports found
> [n0009.scs00:09124] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> ======================
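The log above also says how to see the warnings it is summarizing: as a sketch (parameter name taken from the log itself), disable Open MPI's help-message aggregation so every rank's warning is printed rather than collapsed into a "1 more process has sent help message" line:

```shell
# Disable help-message aggregation, as the log output suggests,
# so each rank prints its own copy of the warning.
export OMPI_MCA_orte_base_help_aggregate=0
# Then rerun, e.g.:  mpiexec -n 2 ./ex19
echo "orte_base_help_aggregate=$OMPI_MCA_orte_base_help_aggregate"
```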
>
> Thanks,
>
> Kai
>
>
>
> On Fri, Jul 25, 2014 at 1:32 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> On Jul 25, 2014, at 12:23 PM, Kai Song <ksong at lbl.gov> wrote:
>
> > Hi All,
> >
> > Thanks for all the suggestions! I took Satish's advice, removed the source, and did a fresh build.
> >
> > It got through compiling, but the tests failed as follows:
> > ==================
> > Running test examples to verify correct installation
> > Using PETSC_DIR=/clusterfs/voltaire/home/software/modules/petsc/3.5.0 and PETSC_ARCH=arch-linux2-c-debug
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > lid velocity = 0.0016, prandtl # = 1, grashof # = 1
> > Number of SNES iterations = 2
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > …
>
> Surely it did not print … here? What was the complete message?
>
> Usually this happens due to an incompatible mpiexec being used because it appears earlier in the path.
>
> What happens if you do
>
> cd src/snes/examples/tutorials
> make ex19
> mpiexec -n 2 ./ex19
>
> for mpiexec use the one associated with the MPI that you built PETSc with
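A quick way to check Barry's point about which mpiexec the shell is picking up (the path to a PETSc-built mpiexec below is an assumption that only holds if configure downloaded its own MPI, which this thread does not show):

```shell
# Show which mpiexec is first on PATH; a stale entry earlier in PATH
# is the usual cause of the incompatible-mpiexec failure mode.
MPIEXEC=$(command -v mpiexec || echo "none found")
echo "mpiexec resolves to: $MPIEXEC"
# If PETSc's configure downloaded MPI, prefer its copy explicitly:
#   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex19
```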
>
> Barry
>
> > ==================
> >
> > I tried to look into src/snes/examples/tutorials/ex19, but didn't find any logs. Is this error from the test okay?
> >
> > Thanks,
> >
> > Kai
> >
> >
> >
> > On Fri, Jul 25, 2014 at 11:06 AM, Satish Balay <balay at mcs.anl.gov> wrote:
> > It's probably best to do a fresh build with the latest petsc-3.5.1 tarball and see
> > if the problem is reproducible [with --with-shared-libraries=0]
> >
> > Satish
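The fresh-build sequence Satish describes would look roughly like this sketch (tarball name from his message; any site-specific configure options used in the original build are not shown in this thread and would need to be repeated):

```shell
# Sketch of the suggested clean rebuild with static libraries:
#   tar xzf petsc-3.5.1.tar.gz
#   cd petsc-3.5.1
#   ./configure --with-shared-libraries=0
#   make all
#   make test
CONFIGURE_OPTS="--with-shared-libraries=0"
echo "configure with: $CONFIGURE_OPTS"
```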
> >
> > On Fri, 25 Jul 2014, Jed Brown wrote:
> >
> > > Matthew Knepley <knepley at gmail.com> writes:
> > >
> > > > On Thu, Jul 24, 2014 at 5:33 PM, Kai Song <ksong at lbl.gov> wrote:
> > > >
> > > >> Hi Satish,
> > > >>
> > > >> Thanks for the quick response. I attached the make.log to this email; at a
> > > >> quick glance, I didn't see any obvious errors.
> > > >>
> > > >
> > > > This never built the MUMPS directory, even though you have PETSC_HAVE_MUMPS
> > > > defined. I can see two
> > > > possible reasons for this:
> > > >
> > > > 1) Your distribution is missing files
> > > >
> > > > 2) You have two copies of PETSc fighting each other here (old PETSC_DIR?)
> > > >
> > > > Can you make sure you have the file mumps.c in
> > > >
> > > > src/mat/impls/aij/mpi/mumps
> > > >
> > > > Jed, how do we check what make is doing here?
> > >
> > > mumps.c should be listed in $PETSC_ARCH/conf/files after the code
> > > generator runs. Possibly a file system problem or tinkering caused it
> > > to appear up-to-date after reconfiguring? (Just speculating because we
> > > haven't seen this problem before.)
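Jed's check can be scripted; the conf/files path follows the petsc-3.5 layout he mentions, and the PETSC_DIR/PETSC_ARCH values are whatever the environment provides (assumptions for illustration):

```shell
# Look for mumps entries in the generated file list; an empty result
# means the code generator never scheduled mumps.c for compilation.
FILELIST="$PETSC_DIR/$PETSC_ARCH/conf/files"
if [ -f "$FILELIST" ]; then
    grep mumps "$FILELIST" || echo "mumps.c not listed in $FILELIST"
else
    echo "file list not found: $FILELIST"
fi
```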
> > >
> >
> >
> >
> >
> > --
> > Kai Song
> > <ksong at lbl.gov> 1.510.495.2180
> > 1 Cyclotron Rd. Berkeley, CA94720, MS-50B 3209
> > High Performance Computing Services (HPCS)
> > Lawrence Berkeley National Laboratory - http://scs.lbl.gov
>
>
>
>
> --
> Kai Song
> <ksong at lbl.gov> 1.510.495.2180
> 1 Cyclotron Rd. Berkeley, CA94720, MS-50B 3209
> High Performance Computing Services (HPCS)
> Lawrence Berkeley National Laboratory - http://scs.lbl.gov