[petsc-users] How to use petsc in a dynamically loaded shared library?

Florian Beck Flo.44 at gmx.de
Wed Jul 18 08:52:24 CDT 2012


Hi,


> On Wed, Jul 18, 2012 at 5:50 AM, Florian Beck <Flo.44 at gmx.de> wrote:
> 
> > Hi,
> >
> > I want to use the PETSc library in a shared library which I'm
> > dynamically loading in my main program. Therefore I'm not calling the
> > functions that destroy a vector, such as VecDestroy. If I do call such a
> > function, I get a segmentation fault.
> 
> 
> Send the full error message (including stack trace).

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun  4 15:34:52 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: -no_signal_handler,--with-dynamic-loading on a linux-gnu named riemann by beck Wed Jul 18 15:41:20 2012
[0]PETSC ERROR: Libraries linked from /home/hazelsct/repositories/petsc/linux-gnu-c-opt/lib
[0]PETSC ERROR: Configure run at Wed Aug  4 15:00:14 2010
[0]PETSC ERROR: Configure options --with-shared --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-scotch=1 --with-scotch-include=/usr/include/scotch --with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1 --with-hdf5-dir=/usr
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------


> > Of course I have a memory leak, because I'm not calling the functions to
> > destroy my vectors. Is there a simple example of how to use the
> > PETSc library in a program like the following pseudo-code:
> >
> 
> Is MPI initialized before this is called? Did you plan to do this in
> parallel? Are you linking PETSc dynamically (as in, you dlopen and dlsym
> PETSc functions to call them, or perhaps you declare weak symbols in your
> code), linking your app-specific solver module (you call PETSc normally
> and use dlsym at a higher level), or something else? Remember to configure
> PETSc --with-dynamic-loading if necessary.

I plan to use it in parallel, but first I want to run it serially. I'm using dlopen to load my library.
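
The loading side of my code looks roughly like the sketch below (the library name libmysolver.so, the exported symbol solve_system, and its signature are only placeholders, not my real names):

/* sketch: load the solver library at run time and call one function from it */
#include <dlfcn.h>
#include <stdio.h>

typedef int (*solve_fn)(int n, const double *rhs, double *x);

int call_solver(int n, const double *rhs, double *x)
{
  void    *handle;
  solve_fn solve;
  int      ierr;

  /* RTLD_GLOBAL so that symbols from libraries the solver pulls in
     (PETSc, MPI, ...) stay visible to anything loaded later */
  handle = dlopen("libmysolver.so", RTLD_NOW | RTLD_GLOBAL);
  if (!handle) { fprintf(stderr, "%s\n", dlerror()); return 1; }

  solve = (solve_fn) dlsym(handle, "solve_system");
  if (!solve) { fprintf(stderr, "%s\n", dlerror()); dlclose(handle); return 1; }

  ierr = solve(n, rhs, x);
  dlclose(handle);
  return ierr;
}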

What happens if I call PetscInitialize several times? I currently call it in every function call.
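
I could imagine guarding the call roughly like this, but I don't know whether that is sufficient (just a sketch; PetscInitialized takes a PetscTruth in 3.1 and a PetscBool in later releases, and the static flag is my own bookkeeping):

/* sketch: make sure PetscInitialize runs at most once per process */
#include <petscsys.h>

static int petsc_owned_by_this_library = 0;

int ensure_petsc_initialized(int *argc, char ***argv)
{
  PetscBool      already;   /* PetscTruth with PETSc 3.1 */
  PetscErrorCode ierr;

  ierr = PetscInitialized(&already); CHKERRQ(ierr);
  if (!already) {
    ierr = PetscInitialize(argc, argv, NULL, NULL); CHKERRQ(ierr);
    petsc_owned_by_this_library = 1;  /* this library must call PetscFinalize later */
  }
  return 0;
}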




> 
> The best way is to reuse data structures, but if you are going to destroy
> them each iteration, make sure you destroy all of them. Note that MPI
> cannot be initialized more than once, but presumably you aren't doing that
> because the rest of the app needs MPI to formulate the original problem.
> 
> Note that doing everything dynamic is more work and defers more errors
> from compile and link time to run time. It is possible, but it takes more
> effort and some familiarity with requirements for dynamic loading.
> 
> 
> >
> > main{
> >
> >  for 1 to 10
> >
> >    do_something
> >
> >    call function_to_solve_Ax=b_with_petsc
> >
> >    do_something
> >
> >  end
> >
> > }
> >

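
Written out, the structure I have in mind is roughly the following (a serial sketch only; the 4x4 placeholder system and the helper name solve_ax_eq_b are made up, and the destroy calls use the 3.1 style -- from 3.2 on they take a pointer, e.g. VecDestroy(&x)):

/* sketch: initialize once, destroy every object created inside the solve,
   finalize once at the very end */
#include <petscksp.h>

static PetscErrorCode solve_ax_eq_b(PetscInt n)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 1, NULL, &A); CHKERRQ(ierr);
  ierr = VecCreateSeq(PETSC_COMM_SELF, n, &b); CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x); CHKERRQ(ierr);

  for (i = 0; i < n; i++) {  /* placeholder system: identity matrix, rhs = 1 */
    ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES); CHKERRQ(ierr);
    ierr = VecSetValue(b, i, 1.0, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = VecAssemblyBegin(b); CHKERRQ(ierr);
  ierr = VecAssemblyEnd(b); CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_SELF, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN); CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

  /* destroy everything that was created in this call */
  ierr = KSPDestroy(ksp); CHKERRQ(ierr);
  ierr = MatDestroy(A); CHKERRQ(ierr);
  ierr = VecDestroy(x); CHKERRQ(ierr);
  ierr = VecDestroy(b); CHKERRQ(ierr);
  return 0;
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  int            it;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); CHKERRQ(ierr);  /* once */
  for (it = 0; it < 10; it++) {
    /* do_something */
    ierr = solve_ax_eq_b(4); CHKERRQ(ierr);
    /* do_something */
  }
  ierr = PetscFinalize(); CHKERRQ(ierr);  /* once, at the very end */
  return 0;
}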
