[petsc-users] Compiling petsc with a user-defined MUMPS directory
Natacha BEREUX
natacha.bereux at gmail.com
Tue Apr 19 12:39:27 CDT 2016
Yes, we have a custom MUMPS version (my company, EDF, belongs to the
MUMPS consortium), so we get some patches and functionalities ahead of
the public version.
The option --download-mumps also works fine (I only have to define the path
to a local archive of the sources). Our build does not use extra options.
We prefer to compile MUMPS on its own, and then build a compatible
PETSc library.
In fact we aim to use MUMPS in two ways:
- direct calls to the MUMPS driver (this is the legacy way)
- through the PETSc interface (this is what I am experimenting with)
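As a concrete illustration of the second way (a hedged sketch: the application name is hypothetical, and the option name is the PETSc 3.6-era `-pc_factor_mat_solver_package`, renamed `-pc_factor_mat_solver_type` in later releases):

```shell
# Select MUMPS as the LU factorization backend at runtime,
# assuming PETSc was configured with MUMPS support.
./my_petsc_app -ksp_type preonly -pc_type lu \
    -pc_factor_mat_solver_package mumps
```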
Best regards
Natacha
On Tue, Apr 19, 2016 at 6:37 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> I see the ptscotch sources include scotch as well.
>
> BTW: What is the reason for installing mumps separately? Does it have
> some patches you are experimenting with? Is it built with extra
> options that --download-mumps does not enable? Something else?
>
> thanks,
> Satish
>
> On Tue, 19 Apr 2016, Satish Balay wrote:
>
> > Natacha,
> >
> > Are you using mumps serially or in parallel?
> >
> > My understanding is that MUMPS, when used in parallel, requires
> > either parmetis or ptscotch.
> >
> > And when used sequentially, it does not require anything [i.e.
> > metis/scotch might be optional].
> >
> > Currently PETSc configure is trying to detect/use mumps in parallel -
> > hence it insists on parmetis or ptscotch.
> >
> > Assuming ptscotch doesn't conflict with scotch [I don't know whether
> > the ptscotch sources also include the scotch sources or not], it is
> > fine to use --download-ptscotch as a workaround [or
> > --download-parmetis, assuming it can build with your install of metis].
> >
> > Ideally you should not need a workaround - hence my asking for the
> > specifics of your usage, to see if our configure check for mumps
> > needs fixing.
> >
> > Wrt specifying scotch or metis: PETSc configure supports metis, but
> > not scotch. So --with-metis-lib/--with-metis-include should work, but
> > not --with-scotch-lib/--with-scotch-include.
> >
> > But it is fine to lump both into the mumps options, as they are
> > primarily dependencies of mumps anyway.
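A minimal sketch of the layout Satish describes, with metis given its own options and scotch lumped into the mumps libs (all paths here are placeholders, not the actual install locations):

```shell
./configure \
  --with-metis-lib="-L/opt/metis/lib -lmetis" \
  --with-metis-include=/opt/metis/include \
  --with-mumps-lib="-L/opt/mumps/lib -ldmumps -lmumps_common -lpord \
                    -L/opt/scotch/lib -lesmumps -lscotch -lscotcherr" \
  --with-mumps-include=/opt/mumps/include
```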
> >
> > Satish
> >
> > On Tue, 19 Apr 2016, Natacha BEREUX wrote:
> >
> > > Hello,
> > > Thanks a lot for your explanations.
> > > We have a "home-made" install of MUMPS, built with metis and scotch
> > > (and, at the moment, with neither parmetis nor ptscotch).
> > >
> > > When I use --with-mumps-lib/--with-mumps-include, I am forced to
> > > specify the scotch and metis libs inside the mumps libs (otherwise the
> > > link of the check program dmumps_c fails). And this occurs even if I
> > > have already defined the location of the scotch libraries through the
> > > --with-scotch-lib/--with-scotch-include and
> > > --with-metis-lib/--with-metis-include options.
> > > Defining PETSc interfaces to the scotch and metis packages is (at
> > > least for me) not sufficient: I have to specify these libraries'
> > > locations in --with-mumps-lib (as shown in the configure.log).
> > >
> > > This is not really a problem and doing so fixes the configure step.
> > >
> > > I agree that duplicating scotch to define ptscotch is weird. I did so
> > > because we do not have a parallel build of scotch.
> > > From PETSc's point of view, MUMPS requires either Parmetis or
> > > PTScotch, so I provided ptscotch through the --download-ptscotch
> > > option. I do not intend to use it; that is only for configuration
> > > purposes!
> > >
> > > Natacha
> > >
> > > On Mon, Apr 18, 2016 at 5:52 PM, Satish Balay <balay at mcs.anl.gov>
> wrote:
> > >
> > > > Same with parmetis.
> > > >
> > > > On Mon, 18 Apr 2016, Natacha BEREUX wrote:
> > > >
> > > > > Hello Satish,
> > > > > thank you very much for your advice. It was very helpful!
> > > > >
> > > > > The configure step finally succeeds if I use the following
> > > > > configure line:
> > > > > ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > > > > --with-mpi=1 --with-debugging=0 --PETSC_ARCH=linux-metis-mumps
> > > > > --with-scalapack-lib="-lscalapack-openmpi -lblacs-openmpi
> > > > > -lblacsF77init-openmpi -lblacsCinit-openmpi"
> > > > > --with-metis-lib="-L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster/lib -lmetis -lGKlib"
> > > > > --with-metis-include=/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster/include
> > > > > --with-mumps-lib="-L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Mumps-501_consortium_aster5/MPI/lib
> > > > > -lzmumps -ldmumps -lmumps_common -lpord
> > > > > -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster1/lib
> > > > > -lesmumps -lscotch -lscotcherr -lscotcherrexit
> > > > > -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster/lib
> > > > > -lmetis"
> > > > > --with-mumps-include=/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Mumps-501_consortium_aster5/MPI/include
> > > > > --with-blas-lapack-lib="-llapack -lopenblas"
> > > > > --download-ptscotch=/home/H03755/Librairies/scotch_6.0.3.tar.gz
> > > > > LIBS=-lgomp
> > > > >
> > > > > I have to specify the scotch shared libraries (-lscotch -lscotcherr
> > > > > -lscotcherrexit) and the metis shared library in the
> > > > > --with-mumps-lib option, otherwise the test (on dmumps) in
> > > > > configure fails.
> > > > > Is it OK to do so?
> > > >
> > > > Hm - it is best to avoid duplicating libraries.
> > > >
> > > > i.e. specifying scotch via mumps-libs and also via
> > > > --download-ptscotch will cause problems.
> > > >
> > > > Why not specify scotch with the
> > > > --with-ptscotch-include/--with-ptscotch-lib options?
> > > >
> > > > >
> > > > > I also tried to use the following shorter line
> > > > >
> > > > > ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > > > > --with-mpi=1 --with-debugging=0 --PETSC_ARCH=linux-mumps-
> > > > > --with-scalapack-lib="-lscalapack-openmpi -lblacs-openmpi
> > > > > -lblacsF77init-openmpi -lblacsCinit-openmpi"
> > > > > --with-mumps-lib="-L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Mumps-501_consortium_aster5/MPI/lib
> > > > > -lzmumps -ldmumps -lmumps_common -lpord
> > > > > -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster1/lib
> > > > > -lesmumps -lscotch -lscotcherr -lscotcherrexit
> > > > > -L/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster/lib
> > > > > -lmetis"
> > > > > --with-mumps-include=/home/H03755/dev/codeaster-prerequisites/v13/prerequisites/Mumps-501_consortium_aster5/MPI/include
> > > > > --with-blas-lapack-lib="-llapack -lopenblas"
> > > > > --download-ptscotch=/home/H03755/Librairies/scotch_6.0.3.tar.gz
> > > > > LIBS=-lgomp
> > > > >
> > > > > I do not use --with-metis-lib/--with-metis-include.
> > > > > I wonder if this is allowed: the libraries are given with
> > > > > --with-mumps-lib, but the includes are not defined.
> > > >
> > > > This is fine. [except for the duplication of ptscotch]
> > > >
> > > > >
> > > > > What is the good practice ?
> > > >
> > > > Either works - but it is best to specify each package with its own
> > > > options as listed by configure. The issues usually are:
> > > >
> > > > - Is this package primarily a dependency of an external package, or
> > > > is there a PETSc interface to it?
> > > >
> > > > For example: PETSc has interfaces to mumps and parmetis, but not to
> > > > scalapack. So if you lump parmetis into mumps-libs, then the PETSc
> > > > interface to parmetis would not be enabled. If you are not using
> > > > this feature, it doesn't matter that it is not enabled.
> > > >
> > > > - Does this external package also require the includes in its
> > > > public interface?
> > > >
> > > > If dmumps_c.h requires metis.h [when MUMPS is built with metis],
> > > > then you might have to specify the metis include also with
> > > > --with-mumps-include. Otherwise, it doesn't matter.
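In that case, a bracketed list of include directories can be passed (a sketch with placeholder paths; I believe configure accepts comma-separated lists in brackets for include options, as it does for the -lib options):

```shell
./configure \
  --with-mumps-include=[/opt/mumps/include,/opt/metis/include]
```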
> > > >
> > > > - Are there bugs in PETSc configure that can trigger wrong error
> > > > checks?
> > > >
> > > > Because mumps depends on scalapack, and optionally on
> > > > metis/parmetis/ptscotch [i.e. only one of them is required, not
> > > > all], there is an error check in configure to make sure at least
> > > > one of them is specified for --download-mumps. Perhaps this check
> > > > should not trigger an error for a user-built mumps.
> > > >
> > > > Did you build MUMPS with both metis and ptscotch? [and not parmetis?]
> > > >
> > > > Satish
> > > >
> > > > >
> > > > > Best regards
> > > > > Natacha
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > On Thu, Apr 14, 2016 at 6:07 PM, Satish Balay <balay at mcs.anl.gov>
> wrote:
> > > > >
> > > > > > You'll have to roll the --with-blacs-lib option into the
> > > > > > --with-scalapack-lib option.
> > > > > >
> > > > > > Satish
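Applied to the failing command quoted below, rolling the blacs libraries into the scalapack option might look like this (a sketch reusing the library paths from that command):

```shell
--with-scalapack-lib=[/usr/lib/libscalapack-openmpi.so,/usr/lib/libblacs-openmpi.so,/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacsF77init-openmpi.so]
```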
> > > > > >
> > > > > > On Thu, 14 Apr 2016, Natacha BEREUX wrote:
> > > > > >
> > > > > > > Sorry, please disregard my last email.
> > > > > > > I made some progress and I am now able to configure PETSc with
> > > > > > > a pre-installed version of metis.
> > > > > > >
> > > > > > > Problems come when I try to configure PETSc with MUMPS
> > > > > > >
> > > > > > > My command line is
> > > > > > > ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > > > > > > --with-ssl=0 --with-mpi=1 --with-debugging=1
> > > > > > > --PETSC_ARCH=linux-metis-mumps
> > > > > > > --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so
> > > > > > > --with-blacs-lib=[/usr/lib/libblacs-openmpi.so,/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacsF77init-openmpi.so]
> > > > > > > --with-metis-lib=[${METIS_PRE}/lib/libmetis.a,${METIS_PRE}/lib/libGKlib.a]
> > > > > > > --with-metis-include=$METIS_PRE/include
> > > > > > > --with-mumps-lib=[$MUMPS_PRE/lib/libdmumps.a,$MUMPS_PRE/lib/libmumps_common.a,$MUMPS_PRE/lib/libpord.a]
> > > > > > > --with-mumps-include=$MUMPS_PRE/include
> > > > > > >
> > > > > > > (where METIS_PRE and MUMPS_PRE are the paths to the local
> > > > > > > installs of metis and mumps)
> > > > > > >
> > > > > > > I get (at least) the following error:
> > > > > > > ./libdmumps.a(dend_driver.o): undefined reference to symbol
> > > > > > > 'blacs_gridexit_'
> > > > > > > /usr/lib/libblacs-openmpi.so.1: error adding symbols: DSO
> > > > > > > missing from command line
> > > > > > > collect2: error: ld returned 1 exit status
> > > > > > >
> > > > > > >
> > > > > > > Would you have any idea what it means?
> > > > > > >
> > > > > > > The configure.log is attached
> > > > > > > Thanks a lot if you can help me !
> > > > > > > Natacha
> > > > > > >
> > > > > > > On Thu, Apr 14, 2016 at 5:19 PM, Natacha BEREUX <
> > > > > > natacha.bereux at gmail.com>
> > > > > > > wrote:
> > > > > > >
> > > > > > > > Hi Satish
> > > > > > > > thanks a lot for the answer. Unfortunately, it does not work
> > > > > > > > yet. More precisely:
> > > > > > > > --download-mumps works fine (and every --download-package
> > > > > > > > option works perfectly). I am then able to compile a PETSc
> > > > > > > > library.
> > > > > > > > --with-package-lib=/usr/lib/libscalapack-openmpi.so or, more
> > > > > > > > generally, --with-package-lib=libXXXX.so also works.
> > > > > > > >
> > > > > > > > But I would like to use static libraries, preinstalled on my
> > > > > > > > computer ... and this fails.
> > > > > > > >
> > > > > > > > For the moment I gave up compiling with MUMPS, and I am
> > > > > > > > instead trying to compile with Metis 5.
> > > > > > > > I have a preinstalled version in some directory, let's say
> > > > > > > > metis_dir. I try
> > > > > > > > --with-metis-lib=[metis_dir/lib/libmetis.a,metis_dir/lib/libGKlib.a]
> > > > > > > > --with-metis-include=metis_dir/include
> > > > > > > > and this fails (see the attached config.log).
> > > > > > > > --with-metis-dir=metis_dir also fails.
> > > > > > > > Is there a problem with static libraries?
> > > > > > > >
> > > > > > > > Natacha
> > > > > > > >
> > > > > > > >
> > > > > > > > On Tue, Apr 12, 2016 at 6:19 PM, Satish Balay <
> balay at mcs.anl.gov>
> > > > > > wrote:
> > > > > > > >
> > > > > > > >> On Tue, 12 Apr 2016, Natacha BEREUX wrote:
> > > > > > > >>
> > > > > > > >> > Hello,
> > > > > > > >> > I am trying to compile PETSc (3.6.3) with external
> > > > > > > >> > packages (MUMPS and its prerequisites).
> > > > > > > >> > More precisely, I would like PETSc to use a pre-installed
> > > > > > > >> > version of MUMPS.
> > > > > > > >> >
> > > > > > > >> > PETSc downloads and compiles the prerequisites (parmetis,
> > > > > > > >> > scalapack, etc.): this works fine.
> > > > > > > >>
> > > > > > > >> What metis/parmetis/scalapack is this MUMPS installed with?
> > > > > > > >>
> > > > > > > >> What version of MUMPS did you install?
> > > > > > > >>
> > > > > > > >> Why could you not use --download-mumps?
> > > > > > > >>
> > > > > > > >> Using a different metis/parmetis/scalapack to install MUMPS,
> > > > > > > >> and then specifying --download-metis --download-parmetis
> > > > > > > >> --download-scalapack [i.e. different versions/builds of the
> > > > > > > >> same libraries], can result in conflicts.
> > > > > > > >>
> > > > > > > >> >
> > > > > > > >> > I define the MUMPS location by
> > > > > > > >> > --with-mumps-dir=top-directory of the MUMPS install, but
> > > > > > > >> > the configure step fails with the following message:
> > > > > > > >> >
> > > > > > > >> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log
> > > > > > > >> > for details):
> > > > > > > >> > --with-mumps-dir=/home/H03755/Librairies/Mumps_MPI did not
> > > > > > > >> > work
> > > > > > > >> >
> > > > > > > >> > I do not understand what is wrong.
> > > > > > > >> > I have attached the configure.log file.
> > > > > > > >> >
> > > > > > > >> > Any hint would be greatly appreciated !
> > > > > > > >>
> > > > > > > >> >>>>
> > > > > > > >> Executing: mpicc -show
> > > > > > > >> stdout: gcc -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi -pthread -L/usr//lib -L/usr/lib/openmpi/lib -lmpi -ldl -lhwloc
> > > > > > > >> Defined make macro "MPICC_SHOW" to "gcc -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi -pthread -L/usr//lib -L/usr/lib/openmpi/lib -lmpi -ldl -lhwloc"
> > > > > > > >> <<<<
> > > > > > > >> Ok - so you are using the system openmpi with gcc.
> > > > > > > >>
> > > > > > > >>
> > > > > > > >> >>>>
> > > > > > > >> Executing: mpicc -o /tmp/petsc-0u_4WI/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O /tmp/petsc-0u_4WI/config.libraries/conftest.o
> > > > > > > >> -Wl,-rpath,/home/H03755/Librairies/Mumps_MPI/lib -L/home/H03755/Librairies/Mumps_MPI/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
> > > > > > > >> -Wl,-rpath,/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -L/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -lscalapack -llapack -lblas
> > > > > > > >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_f90 -lmpi_f77 -lgfortran -lm
> > > > > > > >> -Wl,-rpath,/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -lgfortran -lm -lquadmath -lm -llapack -lblas
> > > > > > > >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_f90 -lmpi_f77 -lgfortran -lm
> > > > > > > >> -Wl,-rpath,/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -lgfortran -lm -lquadmath -lm
> > > > > > > >> -Wl,-rpath,/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -L/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -lparmetis
> > > > > > > >> -Wl,-rpath,/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -L/home/H03755/Librairies/petsc-3.6.3/linux-debug-mumps-ext/lib -lmetis -lm -lm
> > > > > > > >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl
> > > > > > > >> Possible ERROR while running linker: exit code 256
> > > > > > > >> stderr:
> > > > > > > >>
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_lrb_product':
> > > > > > > >> dlr_stats.F:(.text+0x3079): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x30fa): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> dlr_stats.F:(.text+0x310e): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x318f): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_trsm':
> > > > > > > >> dlr_stats.F:(.text+0x33a9): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x33f9): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> dlr_stats.F:(.text+0x340a): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x345a): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_panel':
> > > > > > > >> dlr_stats.F:(.text+0x3576): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x35a7): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> dlr_stats.F:(.text+0x35b8): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x35e9): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_demote':
> > > > > > > >> dlr_stats.F:(.text+0x36ac): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x36ce): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> dlr_stats.F:(.text+0x36df): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x3701): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_cb_demote':
> > > > > > > >> dlr_stats.F:(.text+0x37c1): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x37e3): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dlr_stats.o): In function `__dmumps_lr_stats_MOD_update_flop_stats_cb_promote':
> > > > > > > >> dlr_stats.F:(.text+0x3839): undefined reference to `GOMP_critical_name_start'
> > > > > > > >> dlr_stats.F:(.text+0x3856): undefined reference to `GOMP_critical_name_end'
> > > > > > > >> /home/H03755/Librairies/Mumps_MPI/lib/libdmumps.a(dana_lr.o): In function `__dmumps_ana_lr_MOD_mumps_scotch_kway':
> > > > > > > >> dana_lr.F:(.text+0x115): undefined reference to `scotchfgraphbuild_'
> > > > > > > >> dana_lr.F:(.text+0x131): undefined reference to `scotchfstratinit_'
> > > > > > > >> dana_lr.F:(.text+0x151): undefined reference to `scotchfgraphpart_'
> > > > > > > >> dana_lr.F:(.text+0x15e): undefined reference to `scotchfstratexit_'
> > > > > > > >> dana_lr.F:(.text+0x16b): undefined reference to `scotchfgraphexit_'
> > > > > > > >> <snip>
> > > > > > > >>
> > > > > > > >>
> > > > > > > >> Looks like this MUMPS is built with ptscotch and OpenMP.
> > > > > > > >>
> > > > > > > >> You can specify -lgomp for OpenMP. This can be done with the
> > > > > > > >> configure option LIBS=-lgomp.
> > > > > > > >>
> > > > > > > >> Wrt the PTSCOTCH stuff: depending upon how it is installed,
> > > > > > > >> we'll have to figure out how to specify it.
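One way to check which optional packages a prebuilt static MUMPS depends on is to list the undefined symbols of its archive (a diagnostic sketch; the path is a placeholder, not an actual install location):

```shell
# Undefined symbols reveal what must be supplied at link time:
#   GOMP_*   -> OpenMP runtime (-lgomp)
#   scotchf* -> (PT)Scotch
#   metis_*  -> METIS
nm -u /opt/mumps/lib/libdmumps.a | grep -Ei 'gomp|scotch|metis' | sort -u
```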
> > > > > > > >>
> > > > > > > >> It is best to specify all the packages [mumps and its
> > > > > > > >> dependencies] you built manually with the options:
> > > > > > > >>
> > > > > > > >> --with-package-include --with-package-lib
> > > > > > > >>
> > > > > > > >> Satish
> > > > > > > >>
> > > > > > > >> > Best regards,
> > > > > > > >> >
> > > > > > > >> > Natacha
> > > > > > > >> >
> > > > > > > >>
> > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > > >
> > > > >
> > > >
> > > >
> > >
> >
> >
>
>