[petsc-users] How to use petsc in a dynamically loaded shared library?
Florian
flo.44 at gmx.de
Wed Jul 18 14:12:38 CDT 2012
On Wednesday, 18.07.2012, at 09:15 -0500, Jed Brown wrote:
> On Wed, Jul 18, 2012 at 8:52 AM, Florian Beck <Flo.44 at gmx.de> wrote:
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
> Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes,
> recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun
> 4 15:34:52 CDT 2010
> [0]PETSC ERROR: See docs/changes/index.html for recent
> updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble
> shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: -no_signal_handler,--with-dynamic-loading on a
> linux-gnu named riemann by beck Wed Jul 18 15:41:20 2012
> [0]PETSC ERROR: Libraries linked
> from /home/hazelsct/repositories/petsc/linux-gnu-c-opt/lib
> [0]PETSC ERROR: Configure run at Wed Aug 4 15:00:14 2010
> [0]PETSC ERROR: Configure options --with-shared
> --with-debugging=0 --useThreads 0 --with-clanguage=C++
> --with-c-support --with-fortran-interfaces=1
> --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1
> --with-blas-lib=-lblas --with-lapack-lib=-llapack
> --with-umfpack=1
> --with-umfpack-include=/usr/include/suitesparse
> --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-scotch=1 --with-scotch-include=/usr/include/scotch --with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1 --with-hdf5-dir=/usr
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown
> directory unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
>
>
> This is in your code. Run in a debugger to find out what's wrong.
>
>
I have used ddd, and when I step into the VecDestroy function I get
signal 11. I have three vectors, and it is only possible to destroy one
of them. Do I have to consider anything special before I destroy them?
I can read values from the vector that I am able to destroy.
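
For reference, here is a minimal sketch of the expected Vec lifecycle
(written against the 3.1 API shown in the trace; the size and the
operations are arbitrary): every vector that is created or duplicated is
destroyed exactly once, before PetscFinalize. A crash inside VecDestroy
usually means the handle was never created, was already destroyed, or was
overwritten by memory corruption elsewhere.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x, y, z;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Create one vector and duplicate its layout for the others. */
  ierr = VecCreateSeq(PETSC_COMM_SELF, 10, &x);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &y);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &z);CHKERRQ(ierr);

  ierr = VecSet(x, 1.0);CHKERRQ(ierr);
  ierr = VecCopy(x, y);CHKERRQ(ierr);
  ierr = VecWAXPY(z, 2.0, x, y);CHKERRQ(ierr);   /* z = 2*x + y */

  /* Destroy each vector exactly once.  Note: PETSc 3.1 (the version in
     the trace) takes the Vec itself; from 3.2 on it is VecDestroy(&x). */
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = VecDestroy(y);CHKERRQ(ierr);
  ierr = VecDestroy(z);CHKERRQ(ierr);

  ierr = PetscFinalize();
  return 0;
}
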
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI
> processes.
> You may or may not see output from other processes, depending
> on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
> > > Of course I have a memory leak, because I'm not using the functions
> > > to destroy my vectors. Is there a simple example of how to use the
> > > petsc-library in a program like the following pseudo-code:
> > >
> >
> > Is MPI initialized before this is called? Did you plan to do this in
> > parallel? Are you linking PETSc dynamically (as in, you dlopen and
> > dlsym PETSc functions to call them, or perhaps you declare weak
> > symbols in your code), linking your app-specific solver module (you
> > call PETSc normally and use dlsym at a higher level), or something
> > else? Remember to configure PETSc --with-dynamic-loading if
> > necessary.
>
>
> I plan to use it in parallel, but first I want to run it serially. I'm
> using dlopen to load my library.
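
For reference, loading such a solver plugin from the host program
typically looks like the sketch below; the library name libmysolver.so
and the exported symbol my_solve are placeholders, not part of the
original setup.

#include <dlfcn.h>
#include <stdio.h>

/* Hypothetical entry point exported by the plugin (libmysolver.so). */
typedef int (*solve_fn)(int n);

int main(void)
{
  /* RTLD_GLOBAL so the PETSc symbols pulled in by the plugin are also
     visible to any other objects loaded later that need them. */
  void *handle = dlopen("./libmysolver.so", RTLD_NOW | RTLD_GLOBAL);
  if (!handle) { fprintf(stderr, "dlopen: %s\n", dlerror()); return 1; }

  solve_fn solve = (solve_fn) dlsym(handle, "my_solve");
  if (!solve) { fprintf(stderr, "dlsym: %s\n", dlerror()); dlclose(handle); return 1; }

  int status = solve(100);

  dlclose(handle);
  return status;
}
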
>
> What happens if I call the PetscInitialize function several times? I
> call it in every function call.
>
> You can call it multiple times, but MPI_Init() can only be called
> once. We usually recommend that people only call PetscInitialize once
> (the logging/profiling/debugging infrastructure is more useful that
> way).
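
A rough way to follow that recommendation from inside a dlopen'd library
is to guard the call, so PETSc (and MPI underneath it) are initialized at
most once no matter how many plugin functions are called. This is only a
sketch under the same 3.1 assumption: the function names are placeholders,
and PetscTruth became PetscBool in PETSc 3.2.

#include <petscsys.h>

/* Called at the top of every exported solver function; initializes
   PETSc only the first time. */
static PetscErrorCode ensure_petsc_initialized(void)
{
  PetscTruth     initialized;   /* PetscBool from PETSc 3.2 onward */
  PetscErrorCode ierr;

  ierr = PetscInitialized(&initialized);CHKERRQ(ierr);
  if (!initialized) {
    /* argc/argv are not available inside a dlopen'd library. */
    ierr = PetscInitializeNoArguments();CHKERRQ(ierr);
  }
  return 0;
}

/* Hypothetical hook the host program calls once, when it is completely
   done with the plugin. */
PetscErrorCode my_solver_finalize(void)
{
  PetscTruth     finalized;
  PetscErrorCode ierr;

  ierr = PetscFinalized(&finalized);CHKERRQ(ierr);
  if (!finalized) { ierr = PetscFinalize();CHKERRQ(ierr); }
  return 0;
}

One caveat: if PetscInitialize ends up calling MPI_Init itself, then
PetscFinalize will also call MPI_Finalize, and MPI cannot be initialized
again afterwards; if the host program uses MPI, initialize MPI there and
let PetscInitialize detect that it is already running.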